Experiential Learning in Design Education

Corresponding author: Hung-Hsiang Wang (wanghh@[Link])
Design Studies 99 (2025) 101328. © 2025 Published by Elsevier Ltd.
1 Introduction
Today’s industrial design students face increasing challenges in adapting to
AIoT-driven product-service systems and computational design tools. AIoT
(Artificial Intelligence of Things) refers to the integration of artificial intelli-
gence with the Internet of Things to develop intelligent, connected products
and services―such as wearable devices and smart home appliances―which
are rapidly transforming industry standards (Naeem et al., 2024). Computational design tools, including algorithmic modeling and AI-assisted generative design platforms, enable designers to address increasing complexity and
customization in product development. As these technologies become central
to industrial practice, designers are now expected to possess not only creative
and visual skills, but also computational literacy, data interpretation ability,
and interdisciplinary collaboration experience.
However, foundational design education has not kept pace with these techno-
logical shifts, often emphasizing traditional studio practices without address-
ing students’ preparedness for digital and systemic design contexts (Meyer &
Norman, 2020). This highlights the need to rethink the pedagogical ap-
proaches used in basic design education, particularly in relation to motivation
and self-efficacy. Basic design courses serve as the cornerstone for developing
visual literacy, creative problem-solving, and foundational design thinking.
Historically, these courses have drawn from two dominant traditions: the Bau-
haus model, which emphasizes formal principles such as composition and the
relationship between parts and wholes (Boucharenc, 2006; Ranjan, 2007), and
experiential design approaches, which integrate observation-based learning
with emerging computational technologies (Uysal & Topaloğlu, 2017).
The integration of experiential learning into basic design education was pio-
neered by the Ulm School of Design and the Leeds College of Art in the
mid-20th century. These institutions introduced curricula that combined
empirical observation, mathematical analysis, and scientific inquiry with
design principles derived from Bauhaus ideologies (Bettaieb, 2017; de
Sausmarez, 2001). Leeds emphasized biological deformation and projection
exercises to train perceptual awareness (Hamilton, 1961; Yeomans, 1988),
while Ulm encouraged geometric reasoning and systems thinking (Leopold,
2013; Mäntele, 2021; Oswald, 2022; Spitz, 2002; Takayasu, 2017).
Despite these early innovations, experiential methods are still rarely used in
basic design studios, which often prioritize theoretical constructs and aesthetic
doctrines over real-world engagement. Students from vocational backgrounds, in particular those from art and design groups, often have weaker mathematical, computational, and general academic literacy compared to their academic-track peers (Chang & Lu, 2025; Gök & Erdoğan, 2017; Lin et al., 2019). This skills gap makes it harder for them to connect abstract exercises to meaningful design tasks, especially when they lack experience in digital modeling or algorithmic thinking, contributing to increased anxiety and lower self-efficacy in design learning environments (Chien et al., 2022). Insufficient experience
with algorithmic or digital modeling has also been empirically linked to
decreased motivation and digital self-efficacy in similar student populations
(Liao et al., 2022; Türker & Pala, 2020). These findings underscore the
importance of aligning design education with students’ backgrounds and sup-
porting computational skill development.
2 Literature review
2.1 Experiential learning in basic design
The pedagogical legacy of institutions like the Ulm School of Design and
Leeds College of Art continues to shape how experiential learning is applied
in basic design education. These schools modeled a hands-on, perceptually
grounded approach that blended real-world observation with systematic
exploration ― a method still echoed in today’s studio practices (see
Bettaieb, 2017; de Sausmarez, 2001). These educational innovations laid the
groundwork for combining observation, induction, and abstraction in design
learning, and emphasized engaging the senses, training perception, and recon-
structing form through guided experimentation.
Contemporary scholars argue that experiential learning supports the develop-
ment of analytical thinking, particularly when integrated with computational
or generative tools (Uysal & Topaloğlu, 2017), yet more empirical studies are
needed to validate and scale such pedagogical models.
Kolb’s (2014) experiential learning cycle (Figure 1), as adapted here, contrasts
with traditional methods. Traditional teaching starts with abstract concepts
and analysis of artworks, while experiential learning begins with concrete
observation. This approach uses personal experiences to build and analyze
concepts, emphasizing direct experience over abstract instruction for devel-
oping reliable design principles. For example, Caner Yüksel and Dinç Uyaroğlu (2021) describe a basic design studio model in which students first
actively engage with physical materials and spatial configurations―learning
through doing―before any formal theoretical content is introduced. In this
way, knowledge is constructed from experience and reflection on action, rather
than beginning with theory and moving toward practice. This stands in clear
contrast to conventional curricula that prioritize learning abstract principles
before allowing students to apply them in hands-on tasks.
Table 1 Experiential learning and traditional learning in basic design (based on Caner Yüksel & Dinç Uyaroğlu, 2021)

Figure 1 Traditional learning (left) and experiential learning for basic design (right) (adapted from Kolb, 2014)
Recent studies show that high self-efficacy contributes to improved learning out-
comes and academic performance (Hayat et al., 2020; Li et al., 2022). However,
self-efficacy is also influenced by students’ exposure to scaffolded learning expe-
riences that allow for early success and the gradual development of complex
skills (Yusof et al., 2021). Experiential learning is particularly effective in this regard because it actively engages students in hands-on tasks, provides opportunities for reflection and immediate feedback, and emphasizes mastery through repeated practice in authentic contexts. These features foster both a sense of competence and intrinsic motivation, especially when new or computational tools are introduced incrementally and supported through coaching (Caner Yüksel & Dinç Uyaroğlu, 2021; Hettithanthri & Hansen, 2022). Experiential learning frameworks can thus be powerful tools for improving self-efficacy and motivation in design education.
3 Teaching program
3.1 Course planning
The teaching program was implemented within a Basic Design course offered
to first-year industrial design students at a public university. We combined the BOPPPS model (Bridge-in, Objective, Pre-assessment, Participatory learning, Post-assessment, and Summary) (Wu et al., 2022) with Kolb's (2014) experiential learning cycle to create a structured yet flexible learning environment.
The BOPPPS model was chosen for its systematic approach to lesson plan-
ning, which ensures that each class begins with engagement, clearly communi-
cates objectives, assesses prior knowledge, involves active participation, and
concludes with reflection and assessment. This framework aligns well with
Kolb’s experiential learning cycle, as each BOPPPS component supports a
key phase of experiential learning: “Bridge-in” provides opportunities for con-
crete experience and active engagement; “Objective” specifies the intended out-
comes of learning by experience; “Participatory Learning” facilitates the
process of observation, abstraction, and experimentation; and “Summary”
fosters reflective observation. Additionally, “Pre-assessment” and “Post-
assessment” enable the evaluation of learning improvements. In this way,
the BOPPPS model structures the instructional process to mirror and reinforce
Kolb’s cycle of experiential learning. The course was structured over nine
weeks and designed to foster cognitive engagement through observation,
abstraction, and simulation.
Four thematic learning units guided students through experiential tasks rooted
in natural observation: (1) field observation, (2) rhythm, (3) self-similarity, and
(4) gradient form. Each unit began with a guided exploration of nature-based
forms followed by exercises to extract design patterns using sketching and cod-
ing. Structure Synth, an open-source generative design tool, was introduced to
translate observed patterns into parametric simulations. This sequence was de-
signed to ensure that students actively engaged in direct experience, moved
through reflective and abstract analysis, and practiced applying learned con-
cepts in generative modeling―thus aligning with both Kolb’s experiential cy-
cle and the framework of BOPPPS.
Table 2 Nine-week course schedule

Week 1. Bridge-in (Objective/Outcome): Outline learning goals and expected results, including Ulm and Leeds design works, shape grammar, fractals, and Structure Synth examples. Pre-assessment: Pretest of the Motivated Strategies for Learning Questionnaire (MSLQ) and the Industrial Design Self-Efficacy and Anxiety Scale (IDSEAS).

Weeks 2–3. Participatory learning: Experiential learning in four units, beginning with Unit 1 (Observe nature): Teach field observation methods. Students observe and record three natural objects, identifying patterns in growth and form, and creating both detailed and abstract sketches.

Week 4. Unit 2 (Analyzing the rhythm in nature): Introduce rhythm and shape grammar principles. Students observe and record rhythmic natural objects, identify and describe rhythmic patterns, and use shape grammar to explain generation rules. Apply a Structure Synth code template to simulate rhythmic shapes.

Week 5. Unit 3 (Self-similarity in nature): Introduce self-similarity and fractal principles. Students observe and record self-similar natural objects, identify and describe self-similarity principles, and use fractals to express these rules. Apply a Structure Synth code template to simulate self-similar shapes.

Week 6. Unit 4 (Shape gradient in nature): Introduce shape gradient principles. Students observe and record natural objects with gradient characteristics, identify and describe these principles, and represent them as shape grammars or fractals. Apply a Structure Synth code template to simulate gradient shapes.

Week 7. Report preparation: Introduce report writing and presentation basics; provide Word and PPT templates. Students compile reports on abstraction, rhythm, self-similarity, and shape gradient.

Week 8. Outcome briefing: Students present their learning outcomes. Post-assessment: Conduct MSLQ and IDSEAS post-tests and use rubrics for final performance evaluation.

Week 9. Summary: Review the learning goals and outcomes.
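The "generation rules" that Units 2 to 4 ask students to express can be illustrated with a small L-system, a rewriting grammar closely related to both shape grammars and fractals. The sketch below is illustrative only and is not part of the course materials; the symbols (F for a segment, + and - for turns, brackets for branches) follow the usual L-system convention.

```python
# Illustrative only (not course material): an L-system is a rewriting
# grammar closely related to the shape grammars and fractals taught in
# Units 2-4. Repeatedly applying one rule yields a self-similar string
# that a turtle-graphics interpreter could draw as a branching plant.
def rewrite(axiom, rules, steps):
    """Apply the rewriting rules to every symbol, `steps` times."""
    s = axiom
    for _ in range(steps):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# F = draw a segment, +/- = turn, [...] = push/pop a branch.
rules = {"F": "F[+F]F[-F]"}
print(rewrite("F", rules, 2))
```

Each extra rewriting step nests the same branching motif inside itself, which is exactly the self-similarity students are asked to identify in natural objects.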
To introduce students to generative form-making and rule-based design, the
course incorporated Structure Synth, a lightweight open-source software
that employs a simple declarative programming language to create 3D struc-
tures. Students received a beginner-friendly code template that demonstrated
how to manipulate shape parameters―such as scale, rotation, and color―a-
cross gradients. This template served as a scaffold for exploring how small
rule changes could yield complex formal variations, effectively bridging intui-
tive visual thinking with computational logic. While the coding component
was intentionally minimal to lower technical barriers, it played a critical role
in developing algorithmic thinking and supporting the experiential design
process.
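The course template itself was written in Structure Synth's declarative language and is not reproduced here; the following Python sketch (an analogue with invented parameter names, not the actual template) conveys the same idea of iteratively applying a small scale-and-rotate rule so that one parameter change reshapes the whole figure.

```python
# Rough Python analogue of a Structure Synth gradient rule (illustrative;
# the actual course template used Structure Synth's own language).
import math

def generate_gradient(steps, scale=0.9, angle_deg=15.0, step_len=1.0):
    """Return (x, y, size, angle) tuples for a chain of shrinking, rotating shapes."""
    shapes = []
    x, y, size, angle = 0.0, 0.0, 1.0, 0.0
    for _ in range(steps):
        shapes.append((round(x, 3), round(y, 3), round(size, 3), round(angle, 1)))
        angle += angle_deg                      # small rotation each step
        x += step_len * size * math.cos(math.radians(angle))
        y += step_len * size * math.sin(math.radians(angle))
        size *= scale                           # gradual shrinking: the "gradient"
    return shapes

# Changing a single rule parameter (e.g. angle_deg) transforms the whole
# chain, which is the lesson the template teaches about rule-based form.
print(generate_gradient(steps=3, angle_deg=25.0))
```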
This bridging of observation and conceptualization is essential for scaffolding entry into computational and generative design thinking.
Learning objectives were aligned with students’ existing art and design knowl-
edge and scaffolded to support a gradual entry into generative rule-based
thinking. The instructional emphasis shifted from sensory observation to sym-
bolic representation through shape grammars and fractals. This progression
helped students connect new computational skills with familiar creative prac-
tices, easing the transition from concrete artistic processes to more abstract
algorithmic reasoning.
Each week concluded with structured critiques and peer feedback sessions to
reinforce conceptual clarity. Instructors provided individualized coaching to help students with modeling challenges, particularly during the use of Structure Synth, which many students initially found difficult. Table 3 shows the details of learning Unit 1. These structured feedback and coaching opportunities reflected the "Summary" and "Post-assessment" elements of the BOPPPS
model, and reinforced Kolb’s emphasis on reflection and iterative improve-
ment―fostering both self-efficacy and sustained engagement as students built
confidence with new tools.
Table 3 Learning Unit 1

Objective:
1. Grasp the fundamental principles behind the formation of natural shapes.
2. Enhance abilities to observe and document natural patterns and structures.
3. Relate observations from nature to core design concepts.

Context:
1. Assigned outdoor area: the observation area is located around the university grounds.
2. Students will directly observe and document three natural objects (such as animals and plants).
3. Practice recognizing, describing, and summarizing recurring patterns, and create both detailed sketches and abstract representations of these objects.

Prior knowledge: Students possess fundamental design or art skills and knowledge appropriate for the senior high school level.

Assessment: Summative evaluation of participation, field notes, project reports, and outcome briefings, using a scale from 1 to 4, where 4 is "excellent performance" and 1 is "requires improvement." Assessed dimensions: observation, pattern analysis, reconstruction of generative simulation, and soundness of learning process.

Operational focus: Present on-site observation techniques (such as sketching, photography, and note-taking) during lectures. Apply these methods during fieldwork, with guidance to assist students in recognizing and analyzing patterns in the structures of animals and plants.

Contents:
1. Pre-class materials: Patterns in nature (Covington, 2021); The golden ratio: Nature's favorite number (TED-Ed, 2015); Theory presentation: D'Arcy Thompson and biological form (INKtalks, 2018).
2. In-class guidelines: Discuss field observation precautions, including skills, tools, and methods. Emphasize attention to structure, growth, and form in plants and animals to identify patterns. Guide students in observing campus flora and fauna. Demonstrate proper techniques and encourage comparison with formal principles of beauty (e.g., repetition, gradient, symmetry).
3. Homework: Observe and document three types of animals and plants. Identify and describe recurring patterns, and create sketches. Students must observe nature outdoors, photograph and sketch their observations, identify patterns and proportions (e.g., rhythmic changes), and describe these patterns accurately.
4. Next class: Students will present their homework. The teacher will provide individual feedback.

Activities: Field observation helps connect theory to real-world patterns in nature. Students should follow safety precautions and grasp the learning objectives. They will start with campus observations to face challenges and apply design principles. Ensure they form groups of 3–5 for homework and understand the requirements.

Time management:
1. Watch the reference video for 50 min before class.
2. Attend 50 min of lecture, including 20 min of discussion on the video.
3. Conduct 100 min of campus observation.
4. Homework for next week will take 50 min.
Table 4 Rubrics for learning performance (scored from 4 = excellent performance to 1 = requires improvement)

1. Observation. (4) Can observe and accurately sketch 3 types of living things outdoors. (3) Can observe 2–3 types of living things outdoors and create rough sketches. (2) Can observe 1–2 types of living things outdoors and draw basic realistic sketches. (1) Has not demonstrated the ability to observe living things outdoors or create sketches.

2. Pattern analysis. (4) Accurately uses shape grammar or fractals to express shape generation rules of observed living things. (3) Generally uses shape grammar or fractals to represent shape generation rules with some accuracy. (2) Uses only basic symbols or arrows to depict shape generation rules. (1) Fails to represent any shape generation rules.

3. Reconstruction of generative simulation. (4) Can use Structure Synth to create shapes closely resembling the observed living things. (3) Can use Structure Synth to create shapes that somewhat resemble the observed living things. (2) Can use Structure Synth to create shapes, though they resemble the observed living things poorly. (1) Can barely use Structure Synth, resulting in shapes with minimal resemblance to the observed living things.

4. Soundness of learning process. (4) Clearly includes coherent, logically rigorous steps from observation to reflection. (3) Generally includes coherent, logically rigorous steps from observation to reflection. (2) Partially includes coherent, logically rigorous steps from observation to reflection. (1) Lacks a clear theme, with incoherent discussion or poorly organized steps.

5. Reflection in action to construct knowledge. (4) Clearly uses pictures and text to describe achieving the learning objective through reflection and personal growth. (3) Generally uses pictures and text to describe achieving the learning objective through reflection and personal growth. (2) Partially uses pictures and text to describe achieving the learning objective through reflection and personal growth. (1) Fails to describe achieving the learning objective through reflection and personal growth.
4 Methods
4.1 Experiment design
This study adopted a quasi-experimental “one-group pretest-posttest design”
to evaluate the effectiveness of the experiential learning program. Although
this design is subject to well-documented limitations, it remains a practical
choice in educational research, such as the current study, when random assignment or control groups are infeasible due to ethical and logistical constraints (Knapp, 2016). To enhance validity and reliability, validated and
pilot-tested measurement instruments were employed, and the performance
assessment rubric was developed and reviewed by three faculty members to
ensure content validity. Quantitative findings were triangulated with qualita-
tive data from student reflections and peer feedback. The intervention (X)
was the nine-week experiential teaching program, and outcomes were
measured using a combination of pretest (O1) and posttest (O2) instruments,
as illustrated in Table 5. In addition, Table 6 summarizes paired-sample t-test
results comparing pre- and post-test scores for motivation, strategy use, and
overall MSLQ. For motivation (Pair 1), the mean increase of 0.212 was not
statistically significant (t = 1.448, p = .159). Full results for Pairs 2 and 4
are also provided. Significance was set at p<.05.
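For reference, the paired-sample t statistic used in these comparisons is the mean pre-to-post difference divided by its standard error; a minimal sketch with synthetic scores (not the study's data) is:

```python
# Paired-sample t-test computed from scratch (synthetic scores shown;
# these are not the study's data).
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """Return (mean difference, t statistic) for paired pre/post samples."""
    diffs = [b - a for a, b in zip(pre, post)]
    d_bar = mean(diffs)
    se = stdev(diffs) / math.sqrt(len(diffs))   # standard error of the mean difference
    return d_bar, d_bar / se

pre = [5.0, 5.5, 6.0, 5.2, 5.8, 6.1, 5.4, 5.9]
post = [5.6, 5.9, 6.3, 5.5, 6.2, 6.4, 5.9, 6.1]
d_bar, t_stat = paired_t(pre, post)
print(round(d_bar, 3), round(t_stat, 2))
```

The resulting t value is compared against the t distribution with n - 1 degrees of freedom to obtain the two-tailed p value.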
4.2 Participants
The study involved 29 first-year undergraduate students (11 male and 18 fe-
male) enrolled in the Basic Design course at a public university in Taiwan.
One student withdrew during the intervention due to personal reasons, leaving
28 participants in the final analysis. All participants had received approxi-
mately one to three years of prior training in senior high school-level art or
design programs, focusing primarily on foundational skills such as drawing,
color theory, visual composition, and basic three-dimensional form. Their pre-
vious education was mainly structured around traditional, instructor-led ap-
proaches, with limited opportunities for active or reflective learning. Thus,
the course was designed to bridge the gap between their existing knowledge
gained through conventional methods and the new competencies fostered by
experiential learning―namely, engaging directly with materials, reflecting on
design processes, and developing self-efficacy and motivation through active
participation.
4.3 Instruments
Multiple instruments were used to assess the outcomes of the course.
Note that MSLQ and IDSEAS were administered both before and after the
course to evaluate shifts in psychological variables. Learning performance
was assessed only post-intervention. The combination of quantitative assess-
ment of assignments and qualitative analysis of reflective reports provided a
comprehensive understanding of learning outcomes across cognitive, affective,
and creative domains.
A word-frequency analysis was first applied to identify frequently occurring terms in students' reflective reports, providing an initial sense of
common themes. Based on this, the author and a research assistant manually
reviewed relevant excerpts and grouped them into broad thematic categories.
These groupings were then interpreted to extract insights into students’
perceived changes in motivation, self-efficacy, and engagement with experien-
tial learning. While the approach was informal, it offered a practical means to
capture emergent patterns in a small-scale classroom setting.
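The frequency pass described above can be sketched in a few lines; the stopword list and example excerpts here are hypothetical stand-ins, not the students' actual reflections.

```python
# Minimal word-frequency pass over reflective reports (hypothetical
# English excerpts; the stopword list is a stand-in, not the one used).
from collections import Counter
import re

STOPWORDS = {"the", "a", "and", "to", "of", "i", "in", "it", "is", "was", "but"}

def top_terms(texts, n=5):
    """Count non-stopword tokens across all reports; return the n most common."""
    counts = Counter()
    for text in texts:
        tokens = re.findall(r"[a-z']+", text.lower())
        counts.update(t for t in tokens if t not in STOPWORDS)
    return counts.most_common(n)

reports = [
    "Observing nature carefully revealed hidden rules in plants.",
    "The code was hard at first, yet observing patterns helped.",
]
print(top_terms(reports, n=3))
```

The most frequent terms then seed the manual grouping into thematic categories described above.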
For RQ3, a mediation analysis was conducted using linear regression to deter-
mine whether self-efficacy mediated the relationship between motivation and
learning performance. Following Baron and Kenny's (1986) three-step model, the analysis (1) regressed learning performance on motivation to estimate the total effect (path c); (2) regressed self-efficacy on motivation (path a); and (3) regressed learning performance on both motivation and self-efficacy (paths b and c'). The Sobel test was applied to verify the statistical significance of the indirect (mediated) effect (Baron & Kenny, 1986; Sobel, 1982). This methodological
approach allowed the researchers to move beyond surface-level comparisons
and explore how experiential learning influences learning outcomes through
psychological mechanisms such as self-belief and confidence. For example,
Preacher and Hayes (2004) demonstrated the application of the Sobel test in
educational and psychological research to assess whether changes in self-
efficacy mediated the relationship between instructional interventions and stu-
dent performance. By integrating performance-based assessment with vali-
dated psychological scales, the study ensured both contextual relevance and
methodological rigor.
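The three regression steps and the Sobel test can be sketched as follows; the data are synthetic (generated so that motivation influences performance only through self-efficacy) and are not the study's scores.

```python
# Baron-Kenny steps and Sobel test via ordinary least squares.
# Synthetic data only: X affects Y exclusively through M by construction.
import numpy as np

rng = np.random.default_rng(0)
n = 28
motivation = rng.normal(5.0, 0.8, n)                   # X: MSLQ-like score
self_eff = 0.6 * motivation + rng.normal(0, 0.5, n)    # M: driven by X
performance = 0.7 * self_eff + rng.normal(0, 0.5, n)   # Y: driven by M only

def ols(design, y):
    """OLS coefficients and standard errors; design must include an intercept column."""
    beta, *_ = np.linalg.lstsq(design, y, rcond=None)
    resid = y - design @ beta
    dof = design.shape[0] - design.shape[1]
    cov = (resid @ resid / dof) * np.linalg.inv(design.T @ design)
    return beta, np.sqrt(np.diag(cov))

one = np.ones(n)
# Step 1: total effect of X on Y (path c).
(_, c), _ = ols(np.column_stack([one, motivation]), performance)
# Step 2: X on M (path a).
(_, a), (_, se_a) = ols(np.column_stack([one, motivation]), self_eff)
# Step 3: X and M on Y together (paths c' and b).
(_, c_prime, b), (_, _, se_b) = ols(
    np.column_stack([one, motivation, self_eff]), performance)

sobel_z = (a * b) / np.sqrt(b**2 * se_a**2 + a**2 * se_b**2)
print(round(c, 2), round(a, 2), round(b, 2), round(c_prime, 2), round(sobel_z, 2))
```

With data built this way, a and b are positive while the direct path c' shrinks toward zero, the signature of mediation that the study tests for.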
5 Results
5.1 Student work samples
To provide qualitative insights into student learning, examples of student out-
puts were analyzed across the four thematic units. Pseudonyms were used to
anonymize student identities.
Figure 2 Observing (left) the poison bulb and decomposing (right) its abstract structure
Figure 3 Analyzing (left) the rhythm in vines, exploring (middle) its shape grammar, and simulating (right) the rhythm of vines
Figure 4 Observing (left) the cantaloupe, identifying (middle) self-similarity in the netted skin, and exploring (right) generative rules
Figure 5 Observing (far left) the zebra plant, writing (left) the Structure Synth code, the computer-generated model (right), and mapping (far right) the model onto the photo
If you don’t observe carefully, you won’t discover these magical laws.
(Victoria)
When I squatted down and observed them carefully, I found they are not
only cute but also have rules. (Katherine)
5.2.3 From observation to insight
Participants recognized the link between natural forms and mathematical or
design principles. Several reported an evolving ability to perceive order in
complex shapes.
The most challenging part was the modeling software …, but my progress
from ignorance to a rough understanding of the code was significant.
(Katherine)
Initially unfamiliar with coding, I gradually understood its logic, and suc-
cessfully building code gave me a strong sense of accomplishment. (Grace)
This course differed from past experiences; observing nature revealed the
simple beauty we often miss and highlighted the importance of such
studies in design. (Sarah)
Figure 6 Average learning performance (error bars indicating standard deviations, n = 28)
Figure 7 Comparison of MSLQ pre- and post-test average values (error bars indicating standard deviations, n = 28)
5.3.3 Self-efficacy
Unlike learning motivation, students’ self-efficacy improved significantly
across several dimensions of the IDSEAS. Increases were especially pro-
nounced in problem-solving and modeling experiment subscales, indicating
that students gained confidence in their ability to approach and complete com-
plex tasks. Figure 8 shows that mean scores for the four IDSEAS subscales
ranged from 6.527 to 6.708, with standard deviations from .598 to .746. The
class average increased from 6.596 (SD = .494) to 6.988 (SD = .542). The
Cronbach’s alpha for self-efficacy was .869 (pretest) and .846 (posttest),
showing high internal consistency.
The t-test results in Table 7 show a significant increase in the overall IDSEAS
posttest score by .392 points (p = 0.002), a 5.94 % improvement. Standard de-
viations ranged from .047 to .083. Significant improvements were found in
problem solving (p = 0.012, 5.25 %) and modeling experiments (p = 0.044,
5.58 %). Although improvements in information collection (p = 0.184,
2.81 %) and project implementation (p = 0.088, 5.06 %) were not significant,
there was a slight overall enhancement in self-efficacy, with the greatest gains
in problem solving and modeling.
Figure 8 IDSEAS pretest and posttest mean comparison (error bars indicating standard deviations, n = 28)
Note: *p < .05, **p < .01, ***p < .001.
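As a quick arithmetic check, the reported percentage gains follow directly from the means:

```python
# The 0.392-point gain over the IDSEAS pretest mean of 6.596 reproduces
# the reported 5.94 % improvement.
pre_mean, gain = 6.596, 0.392
print(round(100 * gain / pre_mean, 2))
```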
5.3.4 Mediating effect of self-efficacy
This study conducted a mediation effect analysis using linear regression, with
Y as learning performance (rubrics score), X as learning motivation strategies
(MSLQ post-test score), and M as self-efficacy (IDSEAS post-test score). The
analysis, detailed in Table 8, followed the three Baron and Kenny steps; Table 8 reports B, SE, β, R², adjusted R², and significance for each regression (*p < .05, **p < .01).
Figure 9 Mediation model showing how self-efficacy mediates the relationship between learning motivation and performance in computational design learning. Note: *p < .05, **p < .01.
The direct effect did not reach significance (c' = .262, p = .153). Since (c') is smaller and less significant than (c), self-efficacy shows a complete mediation effect.
The mediation analysis revealed that motivation alone did not directly
improve performance; rather, self-efficacy acted as a key intermediary. In
other words, motivation enhanced performance through its impact on self-
efficacy. Figure 9 illustrates the mediation effect analysis. Learning motivation
strategies positively affect self-efficacy (path a: a = .381, p < .05), and self-
efficacy positively affects learning performance (path b: b = .416, p < .05).
However, the direct effect of learning motivation on performance is not significant (c' = .262, p = .153). The indirect effect through self-efficacy (path a×b = .159) and the Sobel test result (p = .118) are reported in Table 9.
6 Discussion
This section addresses the three research questions guiding the study and inte-
grates recent theoretical perspectives and empirical findings to interpret our re-
sults and their implications for design education.
Although students valued the observation of natural patterns,
the cognitive demands of Structure Synth led to frustration. This tension is
well explained by Expectancy-Value Theory (Pintrich & Schunk, 1996), which
posits that motivation arises from both task value and expectancy of success,
and by Cognitive Load Theory (Sweller, 1988), which highlights the negative
impact of extraneous cognitive load―mental effort caused by poorly designed
or unnecessarily complex instructional elements. These findings also align with
Uysal and Topaloğlu (2017), who stressed the importance of representational
training to reduce barriers in computational design education.
6.2 Implications for design education
These findings suggest that fostering self-efficacy is more impactful than simply enhancing motivation in early computational design education. To support student learning, instructors should scaffold instruction by providing
clear examples (e.g., nature-inspired cases or beginner-friendly code templates
that demonstrate design grammars), timely feedback (e.g., real-time support
via LINE or Google Classroom to address coding challenges and reinforce
progress), and structured opportunities for success (e.g., tiered assignments
that build from simple to complex tasks, or AI tools to reduce coding anxiety
and aid idea translation). Introducing AI-based tools such as ChatGPT, which can convert natural language descriptions into executable code, may lower entry barriers for novice learners; however, the quality and accuracy of generated code can vary and often require validation or revision with instructor support. This instructional potential aligns with theories of scaffolding and guided learning (Collins et al., 1991) and with Vygotsky's (1978)
concept of the Zone of Proximal Development, where learners benefit from
structured support just beyond their current capabilities. As illustrated in
Figure 10, natural language interfaces help maintain engagement by allowing
students to visualize rules and receive immediate feedback, reducing frustra-
tion from syntax errors.
Figure 10 Using ChatGPT to write a Python program for self-similarity of tree branches (left) and then execute the program to automatically draw a tree (right). Note: Top left (translated from Chinese): "You: Write a Python program to draw a tree-like branching structure. Allow users to set the number of branches. The branching angles should be randomly determined between 2 and 5 degrees. The overall tree should be asymmetrical, and a user interface should be provided." ChatGPT reply (translated): "ChatGPT: Here is a slightly modified version, so that the whole tree is not symmetrical."
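For readers who cannot see Figure 10, the following is a minimal, hypothetical reconstruction of the kind of branching program the prompt describes (it is not the actual ChatGPT output); instead of opening a drawing window, it returns the branch segments as data.

```python
# Hypothetical reconstruction of the Figure 10 branching program (not the
# actual ChatGPT output). Returns branch segments instead of drawing them,
# so the structure can be inspected or plotted with any library.
import math
import random

def grow(x, y, angle, length, depth, segments, rng):
    """Recursively append (x1, y1, x2, y2) branch segments."""
    if depth == 0 or length < 1.0:
        return
    x2 = x + length * math.cos(math.radians(angle))
    y2 = y + length * math.sin(math.radians(angle))
    segments.append((x, y, x2, y2))
    # A random 2-5 degree spread per branch (per the Figure 10 prompt)
    # keeps the tree asymmetrical.
    for direction in (-1, 1):
        spread = rng.uniform(2.0, 5.0)
        grow(x2, y2, angle + direction * spread, length * 0.7,
             depth - 1, segments, rng)

segments = []
grow(0.0, 0.0, 90.0, 100.0, depth=5, segments=segments, rng=random.Random(42))
print(len(segments))
```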
conceptualized by Vygotsky (1978) ―can help address this challenge. These
tools should not replace critical thinking or creative exploration but should
serve as facilitators that expand access and promote reflective design practice.
7 Conclusion
This study examined the impact of an experiential learning model―grounded
in observation, abstraction, and computational simulation―on first-year in-
dustrial design students’ motivation, self-efficacy, and learning performance.
The core finding is that self-efficacy, rather than motivation alone, drives per-
formance in computationally enhanced design education. Although nature-
based observation stimulated student interest, it was the successful engage-
ment with modeling tools and reflective learning that led to meaningful perfor-
mance improvements. These mastery experiences were facilitated by
structured support, peer collaboration, and instructor feedback.
Self-efficacy thus emerges as the psychological mechanism through which experiential learning influences outcomes, a finding consistent with Bandura (2010), Pintrich and De Groot
(1990), and Ghbari et al. (2024). It suggests that motivation must be accompa-
nied by competence-building experiences to yield educational benefits.
Generative AI tools such as ChatGPT can serve as scaffolds for novice de-
signers, translating natural language into visual outputs and easing the transi-
tion to computational design. This approach draws on educational theories of
scaffolding and guided learning (Collins et al., 1991) and aligns with
Vygotsky’s (1978) concept of the Zone of Proximal Development, where
learners benefit from structured support just beyond their current abilities.
By reducing technical barriers and enhancing formative feedback, such tools
offer a promising direction for adapting experiential learning to 21st-century
design education.
Funding
This research was funded by the Ministry of Education, Taiwan, under grant number PHA1120191.
Data availability
Data will be made available on request.
References
Bandura, A. (2010). Self-efficacy. In The Corsini encyclopedia of psychology (4th ed., pp. 1–3). Wiley. [Link]
Baron, R. M., & Kenny, D. A. (1986). The moderator—mediator variable distinc-
tion in social psychological research: Conceptual, strategic, and statistical con-
siderations. Journal of Personality and Social Psychology, 51(6), 1173—1182.
[Link]
Bettaieb, D. M. (2017). Proposed strategy in teaching design fundamentals for un-
derstanding the relationship between idea and idea’s projection. Art and Design
Review, 5(2), 129—140. [Link]
Boucharenc, C. G. (2006). Research on basic design education: An international
survey. International Journal of Technology and Design Education, 16, 1—30.
[Link]
Brophy, J. (1998). Motivating students to learn. McGraw Hill.
Caner Yüksel, Ç., & Dinç Uyaroğlu, İ. (2021). Experiential learning in basic design studio: Body, space and the design process. International Journal of Art and Design Education, 40(3), 508–525. [Link]
Chang, J., & Lu, P. (2025). Examining vocational education in Indo-Pacific countries and implications for Taiwan’s vocational education. In The Asian Conference on Education 2024: Official conference proceedings (pp. 661–670). [Link]/10.22492/issn.2186-5892.2025.59.
Chien, Y. H., Lin, K. Y., Hsiao, H. S., Chang, Y. S., & Chan, S. H. (2022).
Measuring industrial design self-efficacy and anxiety. International Journal of
Technology and Design Education, 32, 1317—1336. [Link]
s10798-020-09648-0.
Collins, A., Brown, J. S., & Holum, A. (1991). Cognitive apprenticeship: Making thinking visible. American Educator, 15(3), 6–11, 38–46.
Covington, M., Jr. (2021). Patterns in nature [Video]. YouTube. [Link]/watch?v=gJqciV7PceA.
de Sausmarez, M. (2001). Basic design: The dynamics of visual form (2nd ed.). Her-
bert Press.
Dorland, A. (2024). Designing our thinking: Examining the effects of experiential
learning and design thinking on creativity, innovation, and collaboration skills
development in the undergraduate classroom. The Canadian Journal for the
Scholarship of Teaching and Learning, 15(1). [Link]
cjsotlrcacea.2024.1.14235.
Fantz, T. D., Siller, T. J., & Demiranda, M. A. (2011). Pre-collegiate factors influ-
encing the self-efficacy of engineering students. Journal of Engineering Educa-
tion, 100(3), 604—623. [Link]
Gök, T., & Erdoğan, A. (2017). The impact of vocational high school students’
backgrounds on their mathematics achievements. Eurasia Journal of Mathe-
matics, Science and Technology Education, 13(8), 4987—4994. [Link]
10.12973/eurasia.2017.00977a.
Garcia, T., & Pintrich, P. R. (1994). Regulating motivation and cognition in the
classroom: The role of self-schemas and self-regulatory strategies. In
D. H. Schunk, & B. J. Zimmerman (Eds.), Self-regulation of learning and per-
formance: Issues and educational applications (pp. 127—153). Lawrence Erl-
baum Associates.
Garcia, T., & Pintrich, P. R. (2013). Assessing students’ motivation and learning
strategies in the classroom context: The motivated strategies for learning ques-
tionnaire. In M. Birenbaum, & F. Dochy (Eds.), Alternatives in assessment of
achievements, learning processes, and prior knowledge (pp. 319—339). Kluwer
Academic Publishers.
Gencel, I. E., Erdogan, M., Kolb, A. Y., & Kolb, D. A. (2021). Rubric for expe-
riential training. International Journal of Progressive Education, 17(4),
188—211. [Link]
Ghbari, T. A., Badareen, G. S., Al-Smadi, R. T., & Damra, J. K. (2024). The
mediating role of self-efficacy in the relationship between self-determination
motive and academic engagement among undergraduate students. Participa-
tory Educational Research, 11(3), 43—58. [Link]
per.[Link].
Hamilton, R. (1961). About art teaching, basically. Motif, 8(Winter), 17—23.
Hanik, K., Pramono, S. E., Yulianto, A., & Utomo, C. B. (2025). The effect of
self-efficacy on academic performance: The mediating role of adaptation and
motivation in seafaring students. Journal of Ecohumanism, 4(1), 1739—1750.
[Link]
Hayat, A. A., Shateri, K., Amini, M., et al. (2020). Relationships between aca-
demic self-efficacy, learning-related emotions, and metacognitive learning stra-
tegies with academic performance in medical students: A structural equation
model. BMC Medical Education, 20(1). [Link]
01995-9. Article 76.
Hettithanthri, U., & Hansen, P. (2022). Design studio practice in the context of
architectural education: A narrative literature review. International Journal
of Technology and Design Education, 32, 2343—2364. [Link]
s10798-021-09694-2.
INKtalks. (2018). Theory presentation: D’Arcy Thompson and biological form [Video]. YouTube. [Link]
Knapp, T. R. (2016). Why is the one-group pretest—posttest design still used?
Clinical Nursing Research, 25(5), 467—472. [Link]
1054773816666280.
Kolb, D. A. (2014). Experiential learning: Experience as the source of learning and
development (2nd ed.). FT Press.
Leopold, C. (2013). Precise experiments: Relations between mathematics, philos-
ophy and design at Ulm school of design. Nexus Network Journal, 15(2),
363—380. [Link]
Li, X., Pu, R., & Nutteera, P. (2022). The influence of achievement motivation on
college students’ employability: A chain mediation analysis of self-efficacy and
academic performance. Frontiers in Psychology, 13, 972910. [Link]
10.3389/fpsyg.2022.972910.
Liao, C. H., Chiang, C. T., & Chen, I. C. (2022). Exploring the relationship be-
tween computational thinking and learning satisfaction for non-STEM college
students. International Journal of Educational Technology in Higher Education,
19(1). [Link] Article 43.
Lin, K. Y., Chien, Y. H., & Hsiao, H. S. (2019). Developing computational
thinking through integration of design and technology education. International
Journal of Technology and Design Education, 29(2), 395—420. [Link]
10.1007/s10798-018-9453-4.
Liu, Y., Tantithamthavorn, C., Liu, Y., & Li, L. (2024). On the reliability and ex-
plainability of language models for program generation. ACM Transactions on
Software Engineering and Methodology, 33(5). [Link]
3641540. Article 126.
Mäntele, M. (2021). Grundlehre at the Ulm school of design: A survey of basic design teaching. In A. S. Bessa (Ed.), Form and feeling: The making of concretism in Brazil (pp. 77–88). Fordham University Press. [Link] 10.2307/j.ctv1198zw7.7.
Meyer, M. W., & Norman, D. (2020). Changing design education for the 21st cen-
tury. She Ji: The Journal of Design, Economics, and Innovation, 6(1), 13—49.
[Link]
Naeem, R., Kohtamäki, M., & Parida, V. (2024). Artificial intelligence enabled
product—service innovation: Past achievements and future directions. Review
of Managerial Science, 19, 2149—2192. [Link]
00757-x.
Oswald, D. (2022). Cybernetics, operations research and information theory at
the Ulm School of Design and its influence on Latin America. AI & Society,
37, 1045—1057. [Link]
Pintrich, P. R., & De Groot, E. V. (1990). Motivational and self-regulated
learning components of classroom academic performance. Journal of Educa-
tional Psychology, 82(1), 33—40. [Link]
Pintrich, P. R., & Schunk, D. H. (1996). Motivation in education: Theory, research
and applications (2nd ed.). Merrill Company.
Preacher, K. J., & Hayes, A. F. (2004). SPSS and SAS procedures for
estimating indirect effects in simple mediation models. Behavior Research
Methods, Instruments, & Computers, 36(4), 717—731. [Link]
BF03206553.
Ranjan, M. P. (2007). Lessons from Bauhaus, Ulm and NID: Role of basic design
in PG education. In V. S. Katiyar, & S. Mehta (Eds.), Proceedings of design
education: Tradition and modernity (pp. 2—9).
Schunk, D. H. (1995). Self-efficacy and education and instruction. In
J. E. Maddux (Ed.), Self-efficacy, adaptation, and adjustment: Theory, research,
and application (pp. 281—303). Plenum Press. [Link]
4419-6868-5_10.
Sinclair, S., & Rockwell, G. (2016). Voyant tools [Web application]. [Link]
[Link].
Sobel, M. E. (1982). Asymptotic confidence intervals for indirect effects in struc-
tural equation models. Sociological Methodology, 13, 290—321. [Link]
10.2307/270723.
Spitz, R. (2002). HfG Ulm: The view behind the foreground: The political history of
the Ulm school of design (1953–1968). Edition Axel Menges.
Sweller, J. (1988). Cognitive load during problem solving: Effects on learning.
Cognitive Science, 12(2), 257—285. [Link]
1202_4.
Tuan, H., Chin, C., & Shieh, S. (2005). The development of a questionnaire to
measure students’ motivation towards science learning. International Journal
of Science Education, 27(6), 639—654. [Link]
0950069042000323737.
Türker, P. M., & Pala, F. K. (2020). The effect of algorithm education on students’ computer programming self-efficacy perceptions and computational thinking skills. International Journal of Computer Science Education in Schools, 3(3), 1–15. [Link]
Takayasu, K. (2017). Criticism of the Bauhaus concept in the Ulm school of
design. The Journal of the Asian Conference of Design History and Theory, 2,
9—18. [Link]
Talaver, O. V., & Vakaliuk, T. A. (2025). A model for improving the accuracy of
educational content created by generative AI. CEUR Workshop Proceedings, 149–158.
TED-Ed. (2015). The golden ratio: Nature’s favorite number [Video]. YouTube. [Link]/watch?v=4TF6mMUe3FY.
Uysal, V. Ş., & Topaloğlu, F. (2017). Bridging the gap: A manual primer into
design computing in the context of basic design education. International Jour-
nal of Art and Design Education, 36(1), 21—38. [Link]
jade.12048.
Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes (M. Cole, V. John-Steiner, S. Scribner, & E. Souberman, Eds.). Harvard University Press. [Link]
Wu, C., He, X., & Jiang, H. (2022). Advanced and effective teaching design based
on BOPPPS model. International Journal of Continuing Engineering Education
and Life Long Learning, 32(5), 650—661. [Link]
IJCEELL.2022.125731.
Yeomans, R. (1988). Basic design and the pedagogy of Richard Hamilton. Journal
of Art & Design Education, 7(2), 155—173. [Link]
8070.1988.tb00434.x.
Yusof, N. S. H. C., Razak, N. F. A., Nordin, N. I., & Zulkfli, S. N. (2021). Self-
efficacy, motivation, learning strategy and their impacts on academic perfor-
mance. International Journal of Academic Research in Business and Social Sci-
ences, 11(9), 451—457. [Link]