ARC Discovery Projects Workshop, Faculty of Science
Professor Helena Nevalainen, ARC College of Experts, BSB Panel
Professor Bill Griffin, ARC College of Experts, PCE Panel
10 November 2009
ARC Discovery Projects and Fellowships
• Overview: some statistics to begin
• Process: the black box revealed
• Track record: we believe you (mostly)
• Early Career Researchers/Fellowships: where are you in your career path?
• Body of proposal: logical, exciting, a good story
• Budget tips: thrifty is trendy in the ARC (covered in January Workshop)
• Rejoinders: you must write one; short, not personal, an opportunity to update
ARC Panels
• BSB - Biological Sciences & Biotechnology
• EE - Engineering & Environmental Sciences
• HCA - Humanities & Creative Arts
• MIC - Mathematics, Information & Communication Sciences
• PCE - Physical, Chemical & Earth Sciences
• SBE - Social, Behavioural & Economic Sciences
Discovery Project Proposal Selection Criteria
• Researcher: Track Record & capacity to undertake the research: 40%
• Proposal Project Content:
  - Project Significance & Innovation: 30%
  - Project Approach: 20%
  - National Benefit: 10%
Total: 100%
What happens to your Proposal?
• Each College of Experts member reviews >100 DP applications; each proposal is read by 2 College of Experts members.
• An ARC Executive Director looks after a particular panel, plus other schemes, and assigns Ozreaders to particular proposals on advice from College of Experts (EAC) members. But this is done mainly on the codes and keywords you provide!
• Ozreader = discipline expert drawn from a pool; reviews ≤20 applications; assigned by the ARC Executive Director.
• Intreader = real expert (can be for specific aspects of an application); reviews ≤6 applications.
What happens to your proposal?
• You get comments from readers and write a rejoinder
• Panel members see each other’s scores, and the rejoinders, and adjust to minimise differences
• ARC computes WAPR (rankings)
• August meeting -- start with the highest WAPR and work down until the $$$ are gone -- best funding for the highest-ranked proposals
• Winners notified -- October/November
Ranking
Each reviewer’s weighted score is tallied:
DP = (TR × 0.4) + (S/I × 0.3) + (Appr × 0.2) + (NB × 0.1)
LP = (TR × 0.2) + (S/I × 0.25) + (Appr × 0.2) + (NB × 0.1) + (Commit × 0.25)
• Applications are ranked 1 to N based on weighted scores (N = total number of applications reviewed by that reviewer).
• Each application’s rank is converted to a percent rank.
• Percent ranks are weighted according to the number of applications reviewed by the respective reviewers, and a weighted average is calculated (WAPR).
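The ranking procedure on this slide can be sketched in a few lines of code. This is a minimal illustration of the arithmetic only, assuming the straightforward reading of the slide (rank 1 = best, percent ranks averaged with reviewer load as the weight); it is not the ARC's actual implementation, and all names are invented for the example.

```python
def dp_weighted_score(tr, si, appr, nb):
    """Discovery Project weighted score: DP = TR*0.4 + S/I*0.3 + Appr*0.2 + NB*0.1."""
    return tr * 0.4 + si * 0.3 + appr * 0.2 + nb * 0.1

def percent_rank(rank, n_reviewed):
    """Convert a reviewer's rank (1 = best of n_reviewed applications) to a percent rank."""
    return 100.0 * rank / n_reviewed

def wapr(reviews):
    """Weighted Average Percent Rank for one application.

    reviews: list of (rank_given, n_reviewed) pairs, one per reviewer.
    Each reviewer's percent rank is weighted by how many applications
    that reviewer assessed, so heavily loaded reviewers count for more.
    """
    total_weight = sum(n for _, n in reviews)
    return sum(percent_rank(r, n) * n for r, n in reviews) / total_weight

# Example: one reviewer ranked the application 1st of 10 reviewed,
# another ranked it 3rd of 20 reviewed.
score = dp_weighted_score(tr=85, si=90, appr=80, nb=70)
ranking = wapr([(1, 10), (3, 20)])
```

Note that a lower WAPR is better here, since rank 1 maps to the smallest percent rank; at the August meeting, funding proceeds from the best-ranked proposals downward until the budget is exhausted.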
Recommendation to the Minister
[Flow diagram: applicant → ARC research branch → ED → panel; EAC round 1 (comments only) → rejoinder → EAC round 2 (scores and ranking) → selection meeting (discussion ± reranking) → recommendation. Budget is considered in detail at the selection meeting.]
A Discovery Project Application
• The proposal is written like a top journal article, from the Introduction to the end of the Method section (include method analysis)
• It is an academic argument on how to advance the academic field/knowledge, to prove this is a significant idea(s)
• It has specific, consistent, meaningful research objectives; Research Questions/hypotheses derived from a critical literature review; RQs matched to the studies proposed in the Approach
• The proposal should be consistent with the scoring of the selection criteria: Track Record (eg carefully compose the team), S&I (eg do a critical literature review), Approach (eg specific & matched to research questions or hypotheses), & National Benefit (application of the results)
• Do multiple drafts, get feedback, do a pilot study -- it needs to be written very well so it is good relative to the best
• Based on the Funding Rules & Instructions to Applicants: http://www.arc.gov.au/ncgp/dp/dp_fundingrules.htm
Discovery Project Application Form
PART A – Administrative Summary
  A2 – Proposal Title
  A5 – Summary of Proposal (100 words); Summary of National/Community Benefit (100 words)
  A6 – Keywords, Research Classifications
PART B – Personnel, including Track Record and Research Record Relative to Opportunity
PART C – Budget
PART D – Report on any current ARC Projects
PART E – Project Description
Proposal Title & 100-Word Summary
Think of a catchy title, eg “Body fluids: sweet protection against infection?”
Two 100-word summaries:
• An “abstract” that gives a clear idea of what is proposed and why it is important (outcomes) -- written for a scientist
• A summary for propaganda purposes (for use by the media when successful proposals are announced) -- simple language, emphasis on importance and potential outcomes (national benefit)
TRACK RECORD • Based on past achievement • high level of input from broad base of outside sources (eg journals, societies…) • Correlates with other criteria, as well • feasibility (approach) • significance and innovation • TR scores tend to add up, ie weighted to higher track records of CIs and PIs (rather than averaged) • plenty of opportunity to make clear in application
B10 research record relative to opportunity 1 read and follow the instructions most significant contributions to this research field (B10.1) don’t hold back (we will believe you, generally) (but over-the-top is picked up and discounted) focus on your impact directly (narrowly) on research outcomes in this half page, ie how you have changed/moved this area of research
B10 research record relative to opportunity 2
Do not do this: “I have carried out research in area x for 20 years and have published significant papers, and have obtained $20 million in research funds blah blah.”
Do this: “I discovered x (see papers 1, 2, 3), which resulted in an international effort to find y (citations n). I discovered that a results in b such that the previously accepted paradigm was incorrect (papers 4, 5, 6). This has led to numerous other groups…. The outcomes of my research have resulted in z being used by …… in a commercial……. (evidence, see publications 5, 6, 7+)”
B10 research record relative to opportunity 3 Significant publications in last 5 yrs (B10.2) Ensure that authorship role is clear on all publications (account for differences in conventions of discipline, journal, team) do not assume all reviewers will know conventions place explanation in an obvious place Enhance basic information with evidence of impact (think RQF, ERA whatever), succinctly include information on: reviews, sales, other impact of books impact factors, citations (H-index), other impact of articles acceptance rate (if appropriate), other impact of conference papers relevance/impact indices of other publications
B10 research record relative to opportunity 4 Ten best career publications (B10.3) Unlimited space: Provide clear evidence of impact (think again of RQF): number of times publication has been cited, referred to, etc… any type of (positive) editorial reaction to publication practical outcomes at some point it would be worth giving average citation rates compared to average in field etc.
B10 research record relative to opportunity 5 10.4 other evidence of impact and contributions • half page • continue theme begun in 10.1, broadening emphasis to wider recognition of your research record • prizes, awards, patents, experience in industry, editorial boards… • reviewer is good, associate editor (expert panel) is better • member of society is good, officer is better • participant in conference is good, organiser is better • presentations are good, invitations are better • broader recognition of your research (eg consultancies) • place all achievements in context, eg • award given every 5 years • first non-American to receive…
B10 relative to opportunity 6 10.5 other aspects of career…relevant to assessment • half page • use as required • be clear, succinct and reasonable • major illnesses or injuries • relocations • time off for maternity/paternity leave • changes of career, research directions • Other • Can take the opportunity to present reviews of papers recently submitted.
B10 research record relative to opportunity 7 10.6 fellowship supplementary information • one page • fellowships are good additions to applications • follow instructions, addressing all points • Can be at same institution but give strong reasons
No. & Success Rate of ECR-Only Proposals (≤5 years since PhD) for 2010
2010 Discovery Project Fellowships
Need excellent scores on both (1) the Project and (2) the Fellow
• Australian Postdoctoral Fellowships (≤3 years since PhD)
  - 2010 success rate = 17.1%
  - Can be 3 years (100% ARC salary support) or 4 years (75%)
• Australian Research Fellowships/Queen Elizabeth II Fellowships (≤8 years since PhD, or ≤13 years if previously held an ARF/QEII)
  - 2010 success rate = 17.8%
  - Can be 50% ARC salary support, which has a much better success rate
• Australian Professorial Fellowships (≤13 years since PhD)
  - 2010 success rate = 16.3%
  - Can be 50% ARC salary support, which has a much better success rate
Body of proposal
• Construct it for the right audience (ie College of Experts, Oz-readers and Int-readers)
• Consider that a CoE member might not know the field
• Make it exciting -- but watch out for obvious hype
• Have clear aims and hypotheses linked to the approach
• Use preliminary data (VIP), but make sure it reproduces well in copies; do not use small fonts
• Show how your previous research is relevant and how you are leading the field
• Keep reminding yourself of the assessment weightings: Track Record 40%, S&I 30%, Approach 20%, National Benefit 10%
Rejoinders
• Always provide one
• Usually used to discard or reduce the weighting of an assessor who may have been too harsh
• It does make an impact, so construct it carefully; do not get personal
• Can provide additional findings or publications
• Update your progress on the topic since submission
Budget tips
• No point going for teaching relief (BSB)
• No point indexing salaries
• Provide good justification (will reduce the degree of cut, depending on ranking) -- including the roles of requested Research Assistants, etc
• MU is very good (uniquely?) at providing HDR scholarships -- don’t request them, but put them in as the University’s contribution
• Do not make the project absolutely dependent on a large budget; remember the average cut is about 40% (BSB), and the average budget is about $300-350K over 3 years