
Level 3
Extended Project Qualification
7993 EPQ

Report on the Examination
June 2017

Version: 1.0
Further copies of this Report are available from aqa.org.uk

Copyright © 2017 AQA and its licensors. All rights reserved.


AQA retains the copyright on all its publications. However, registered schools/colleges for AQA are permitted to copy material from this
booklet for their own internal use, with the following important exception: AQA cannot give permission to schools/colleges to photocopy any
material that is acknowledged to a third party even for internal use within the centre.

This summer has seen another increase in entry size for this Level 3 research qualification. It is
very encouraging to note that moderators saw much excellent centre practice and that the
majority of centres were found to have marked projects well in line with the AQA standard. As
usual, moderators have been impressed by the inventiveness in choice of topics and the skill
shown by candidates as they develop these topics into fascinating outcomes. We continue to feel
privileged to have access to these outcomes. There are clearly very many hard-working
supervisors and coordinators. We salute the excellent delivery of the qualification and the palpable
benefit to students across the whole ability range.

Unfortunately, some centres delivering the qualification are not successfully supporting their
students, and I will point out in this report the various areas of concern expressed by moderators.

Some centres appear not to understand that using the Production Log is a requirement of the
qualification. A significant number under-used the Log, completing it with only minimal detail and
often supplementing project submissions with other documents. In some cases, centre-devised
paperwork supplanted the AQA Production Log. Some centres submitted brief, hand-written Logs
accompanied by long, typed documents sent as extra attachments, including diaries and
reflections, which provided precisely the kind of evidence that should have been entered in the
student Logbook. There was also evidence of retrospective completion of Logs, which shows a
real misunderstanding of the process nature of this qualification. Little valid assessment evidence
can be found in a Log completed retrospectively.

Project approval was sometimes given less attention than is ideal. We saw approval of unsuitable
working titles: for example, proposals that were too vague or too broad, that invited description,
speculation or opinion, or that used subjective and/or immeasurable terms, making it difficult to
draw meaningful conclusions. Centres are reminded that a final title does not need to be reached
until the mid-project review, but that even at the 'working title' proposal stage the candidate should
be proposing something that will invite genuine, focused Level 3 research.

A few projects were seen that, although not definitely guilty of dual accreditation, were very close
to specifications being studied by the candidate and should not have been approved. We also saw
some projects undertaken by candidates at universities, typically on an organised placement,
where the topic choice, research material and project direction were not truly owned by the
candidate.

A substantial number of centres appeared to dictate a report structure to their candidates. Please
note that the specification does not require reports to consist of, for example: 'abstract,
introduction, literature review, findings/results, discussion, evaluation and conclusion'. This
structure may well suit some research projects, but it does not suit all. Candidates should be free
to choose a report structure that best suits their project.

Similar dictation from some centres was seen with respect to primary research. I would like to
stress that whilst in some cases the collection and analysis of primary data is of great value within
a project, this is not always the case. There is no requirement in the specification for primary data
collection. Many projects are very successfully undertaken using only secondary research.

Artefact projects continue to be very variable. At the top end, we have seen truly superb projects,
but it appears that some centres are confused about the requirements of the accompanying written
report. Artefact projects are still being submitted from some centres with only a narrative
commentary and no real research base. We have seen projects that do not evidence any of the
design or manufacturing process decisions that the candidate has made. A common approach
consisted of a written report about the topic, with the artefact appearing as a standalone product.
The candidates in such cases did not seem to understand that they should explain and show how
they designed and made the artefact and how research allowed them to reach this point. There
was some evidence to suggest that the artefact route is being seen as a 'soft option' by some
centres for weaker candidates. It must be stressed that a large amount of candidate research is
required to do this type of project justice. Moreover, artefact reports should be written in a formal
academic style, developing the skills of referencing, bibliography and academic writing.

Perhaps the greatest success of this moderation round was the number of supervisors who chose
questions carefully, both to challenge the student and to give the opportunity to enhance evidence.
Many centres are now providing excellent written records of the Q&A sessions held after
presentations in the candidate Production Log, Presentation Part B. This can really enhance the
assessment evidence. Please note, however, that evidence of the presentation can only be
submitted on paper: it is not acceptable to submit audio or video recordings as a substitute for a
written record. There were also some instances where only evidence of the questions asked was
provided; without a written record of the candidate's responses to those questions, no assessment
evidence was generated from the Q&A session.

Many centres caused additional work for moderators by either incorrectly adding up the four centre
AO marks or by entering an incorrect total into the AQA electronic mark submission system. It
would be much appreciated if centres could carefully check both the addition and the mark entry in
future series.

Very concerning was the increase in the number of projects referred for breach of JCQ regulations.
Centres are reminded that interim marking of a student's report is not permitted.

Marking, as mentioned above, was mostly sound, but instances were seen where centres had
placed marks high within mark bands, seemingly paying no attention to the clause in section 2.5.3
of the specification which states that 'Higher marks at each level may be used where work is
judged to meet the criteria readily, consistently and across different elements of the project', and
instead seemed tempted to award the higher marks where evidence could be found just once or twice.

Some comments from supervisors referred to evidence seen or heard by them during meetings
with the candidate. This is not valid evidence. Marks can only be awarded from the evidence that
has been submitted by the candidate: this includes the Production Log, Product and Presentation
evidence, plus any other relevant appended evidence selected for inclusion by the candidate.

To focus on the assessment objectives:

AO1

There was some exemplary planning and project management seen by moderators. The best
candidates chose appropriate areas of research, recorded initial plans, demonstrated evaluative
monitoring, explained consequent changes and were evaluative throughout; their Production Logs
contained much evidence of application of the process and of the use of higher skills.

There were, however, plenty of Production Logs that lacked detail and were merely descriptive.


Common features of Production Logs that resulted in little evidence to support high AO1 marks
were –

• Planning that was time-based only
• Little or no recorded monitoring
• Planning of the report being considered before research had been completed or, in some
cases, even started

The principal source of over-marking was placing work in the top band for AO1 on evidence that
was lacking in one or more respects; the reasoning seemed to be that, since the work was
completed, planning must have taken place.

AO2

As for AO1, there was some exemplary use of resources seen: very detailed and reasoned
research appropriate to the task, detailed bibliographies in accepted formats, clear and
documented evaluation of sources, detailed citation, and detailed, critical analyses of sources
leading to clarity about why and how each source had been used. Most projects, of course, fell
short of this ideal.

Common features of projects that provided little evidence to support high AO2 marks were –

• Over-reliance on non-evaluated Internet sources; this continued to be widespread and was
rarely commented on by supervisors/coordinators
• No bibliography
• Very extensive bibliographies, but few sources cited
• Bibliography incomplete and poorly formatted
• No recorded evaluation of sources; many candidates mentioned evaluation in their logs but
did not present any evidence of how it had been undertaken nor on which sources
• 'Evaluation' interpreted as 'what did you use the source for'
• Lack of critical analysis; many candidates were placed in the top band without evidence
that material from sources had been analysed
• Referencing erratic or non-existent
• Little compelling evidence that sources had actually been used
• Resources stated and used that were sometimes below A-level standard, including GCSE
textbooks, BBC Bitesize etc.

AO3

The quality of evidence for AO3 was understandably more variable than that for the other AOs.
The best evidence showed clear and reasoned decision-making that was fit for the purpose of the
project and had a clear outcome. Excellent projects included evidence of each of the higher-level
skills being used consistently throughout the work. Where the product was an artefact, it was fit for
purpose and based clearly on research documented in the short report.

There were a number of features commonly found in less strong projects. The proportion of
projects placed in the top band for AO3 was very large indeed, and a very significant proportion of
these marks were not supported by the submitted evidence. Issues included:

• Aims, as set out by the candidate, were not fully met
• Plan, as set out by the candidate, was not fully implemented
• Little evidence of decision-making was submitted; frequently this was the case when the
Log was purely descriptive
• Little or no evidence of change or development, again often due to poor completion of the
Log
• AO3 marked only on the 'essay'; this was quite widespread and demonstrates a
misunderstanding of the assessment criteria
• Artefacts that were not fit for purpose, as proposed by the candidate
• Lack of cohesion, especially when the report was sectionalised; this sometimes resulted in
a lack of synthesis

AO4

In the best reports, we saw application of sound judgement as candidates reached conclusions,
firmly based on the evidence of their research. The best Summary and Reflection pages in the log
book evaluated the project experience fully and showed clearly what benefits the candidate had
drawn from the work. Within some excellent presentations, candidates summarised the outcomes,
justified the conclusions drawn and evaluated the process used.

In all of these cases plenty of AO4 evidence was found.

Unfortunately, some candidates were encouraged to give their own opinions rather than arriving at
conclusions based on sound research. It was also noted that whilst most candidates evaluated
their own learning, not all evaluated the completed product.

Some excellent Taught Skills programmes were seen, with clearly evidenced skill development of
candidates. However, some centres seemed not to understand how vitally important this aspect of
the qualification is. In many cases where candidates submitted low-quality evidence, it was clear
that few skills had been developed.

Each centre has an AQA-appointed Project Adviser who can discuss all aspects of specification
delivery, including the Taught Skills programme.

Centres whose marks have been adjusted should ensure that they develop an understanding of
the AQA standard for this specification. Standardisation exemplars can be found in Secure Key
Materials via eAQA, and for artefact projects Teacher Online Standardisation (TOLS) is available.
Face-to-face, full-day Teacher Standardisation meetings take place twice a year and are bookable
via the AQA website.

To conclude as I began, I am full of admiration for the staff at most centres who are delivering this
qualification in exemplary fashion. As a moderation team we now look forward to the next round of
moderation in November.


Mark Ranges and Award of Grades

Grade boundaries and cumulative percentage grades are available on the Results Statistics
page of the AQA website.

Converting Marks into UMS Marks

Convert raw marks into Uniform Mark Scale (UMS) marks using the link below.

UMS conversion calculator

