Reviews have two purposes: to improve the product (helping the reader, author, and editor) and to discriminate its quality (helping the reader and editor). Let your conscience be your guide. Mine says to accentuate the positive, for reasons found in my treatment of "Doing Science."
For manuscripts, concentrate on improvement. Virtually any manuscript can be published by a persistent author, so you might as well improve matters for readers and literature searches. The volume and kind of improvements suggested quickly make obvious which papers are not fit for publication in a particular journal and whether they might ever be made so. Do triage first; don't spend time on the little details if the paper is DOA (dead on arrival) or hemorrhaging (leave those details for a post-resurrection review). If the grammar or style is poor, however, do mention it and suggest an informal review after pointing out the major revisions needed.
In reviewing proposals, strike a balance between quality discrimination and improvement. I have seen proposals rewritten by reviewers (which does not seem like a good idea to me). But if you can't suggest improvement, how much should you criticize? Avoid private comments to the editor or program manager unless public comments would jeopardize your anonymity. My least favorite reviews, in past roles as editor and program manager, have been glowing public reviews paired with private, anonymous reasons why the paper should not be published or the proposal should not be funded.
In manuscripts, pay close attention to reproducibility. Are any important details missing? Could the design be conveyed more clearly? Would another form of presentation (figure, equation, table or text) help?
Focus heavily on originality. Is it merely a repeat in a new environment or with a new species? How does it go beyond earlier thinking? Under analysis and interpretation: do the data support the conclusions? Would alternative or additional analyses help? Are alternative interpretations excluded by the data? Could an argument be simplified, or has it been oversimplified? Could the material be better synthesized?
Is it balanced? Is opinion distinguished clearly from fact? Are all sides of a controversy presented equitably?
Will it have impact (cause enlightenment, draw conclusions with force, be relevant)? How will publication change the way that readers think and act? How durable is the contribution?
Has the author targeted the right audience? Is this journal the right one? Could the material be improved for this audience?
Is it readable? Rate it on clarity. Rate it on style. Evaluate its voice.
A principal criterion is feasibility. If you are convinced the work can be done, why? If you are not, what would it take to convince you?
Consider impact. Will the work influence thinking and practice in the field? Will it have impact beyond the field (which can count against it if the impact outside the sponsoring program would exceed the impact within it)? Evaluate impact per dollar, but take into account the number of years and PIs before remarking on high cost. Weigh the risk of the proposed work against its reward; high risk with the possibility of high reward should be exciting.
Evaluate: timeliness (why the work is important to do now, and why it can be done now), originality, clarity, demonstrated command of the literature, and past performance.
Address other specific or general criteria for this proposal competition.