ModelFactory: A Matlab/Octave based toolbox to create human body models


Optimization, Robotics & Biomechanics, Institute of Computer Engineering (ZITI), Heidelberg University, Berlinerstr. 45, 69120 Heidelberg, Germany


Model-based analysis of movements can help better understand human motor control. Here, the models represent the human body as an articulated multi-body system that reflects the characteristics of the human being studied.



We present an open-source toolbox that allows for the creation of human models with easy-to-setup, customizable configurations. The toolbox scripts are written in Matlab/Octave and provide a command-based interface as well as a graphical interface to construct, visualize and export models. Built-in software modules provide functionalities such as automatic scaling of models based on subject height and weight, custom scaling of segment lengths, mass and inertia, addition of body landmarks, and addition of motion capture markers. Users can set up custom definitions of joints, segments and other body properties using the many included examples as templates. In addition to the human, any number of objects (e.g. exoskeletons, orthoses, prostheses, boxes) can be added to the modeling environment.



The ModelFactory toolbox is published as open-source software under the permissive zLib license. The toolbox fulfills an important function by making it easier to create human models, and should be of interest to human movement researchers.

This document is the author’s version of this article.


Authors: Manish Sreenivasa, Monika Harant

Keywords: Open Source, Human Models, Articulated Multi-body Modeling, Movement Biomechanics, Joint Kinematics, Segment Inertia

1 Background

Multi-body models can represent the human body segments and the relative movement between them. Coupled with mechanics-based methods, such models can be a valuable asset in the analysis of human movements. Here, we refer to models that describe the limbs as rigid segments connected via idealized joints. By utilizing mechanics-based methods (e.g. [1, 2]), such models may be used to analyze recorded movements, simulate novel movements, and compute internal body parameters, such as joint torques, that may not be easily measured (e.g. [3, 4, 5, 6, 7]).

An important prerequisite is to create models that accurately match the body proportions and inertial properties of the human subjects. Additionally, models need only be as detailed as necessary to answer the specific questions being asked. For example, for motions that occur predominantly in the sagittal plane, it may be sufficient to model body segment properties only in that plane (e.g. [5, 6, 7, 8]). On the other hand, for more complex movements such as balancing [9], or for atypical bone geometry [4], it may be necessary to incorporate further model degrees of freedom (DoF). To study human movements, it is therefore often necessary to create models of the human body that are specially suited to the motion or the specific subject being considered.

Figure 1: ModelFactory overview: The toolbox components can be divided into core functions related to model setup, file input/output and other utilities; user-provided functions such as custom setups; and the information pertaining to a specific model. The ModelCreator script communicates with these components to generate the model file for visualization and export.

There exist several examples of software frameworks that allow for the analysis of movements using rigid-body models (e.g. RBDL [1], Puppeteer [3], Simbody [10], Simox [11] and the HuMAnS Toolbox [12]). However, the creation of models for usage with such software is left up to the user. Model creation often involves tedious, manual editing of XML or other formatted files to set up the model kinematics and inertial properties. While this approach may be feasible for a single-use model, for example that of a robot or a generic human that will not be changed frequently, it is unrealistic for modeling large numbers of body types, joint variations and model configurations. For example, an application that would require many models could be a study that involves model-based analysis of experimental data recorded from many subjects (e.g. [7]). In order to accurately analyze the recorded motions, we need to create subject-specific models corresponding to the body shape and proportions of each participant. Another application could be creating model variations that differ slightly but systematically from each other, for example a human model and a variation of it that includes a prosthesis [13]. To the best of our knowledge, there exists a gap in software tools that allow users to efficiently create large numbers and variations of subject-specific multi-body human models.

Here, we detail such a model-creation toolbox consisting of scripts written in Matlab/Octave. The collection of scripts, named ModelFactory, allows users to define model configurations in a modular and flexible manner, to choose from an available set of rules to compute segment properties, and to visualize and export the corresponding models. The toolbox also provides functionality to adjust the model kinematics and inertial properties to subject-specific characteristics.

Figure 2: Human scaling algorithms: [Left] Model with torso segmented into upper trunk, middle trunk and pelvis segments. [Right] Model with fused torso. Models shown are derived from the equations specified by de Leva 1996 [14]. Blue circles indicate motion capture markers and red circles indicate points on segments. Also shown are the local coordinate frames of the segments.

2 Implementation

The main part of the toolbox, the ModelCreator, receives input from several components which contribute towards generating the final model (Fig. 1). These components provide mandatory information (e.g. results from a scaling algorithm, subject anthropometry), and/or optional information that may be used to further refine the model (e.g. custom segment lengths). In the following, we describe the toolbox components in further detail.

2.1 Dictionary

The basic building blocks for creating a model are available as predefined “descriptors” of commonly used joints, points and constraints. This built-in dictionary facilitates the easy setup of a large variety of models by choosing the right combination of descriptors. The following descriptor types are available by default:

Joint Types

Depending on the movement and body segment under consideration, various combinations of rotational and translational joints may be needed. Individual joint types for each segment may be specified by using the corresponding descriptors. For example, the descriptor “Joint_RY” describes a 1 DoF rotational joint about the Y-axis, which is defined in the dictionary by the vector

[ 0 1 0 0 0 0 ]

The first 3 indices of this vector indicate the rotational axes X, Y and Z, and the last three indicate the translational axes X, Y and Z. Similarly, “Joint_Root2D_TXTZRY” describes a 3 DoF joint with translational axes along X and Z and rotation about Y. This joint type may be used to describe the floating base joint of a planar model, and is defined as:

[ 0 0 0 1 0 0 ]
[ 0 0 0 0 0 1 ]
[ 0 1 0 0 0 0 ]
with each row of the matrix above corresponding to one of the DoFs. Note that here we follow the spatial vector formulation of Featherstone [2], which is also used in the multi-body dynamics software RBDL [1]. Commonly used joint types are predefined in the dictionary file customSetups/dict/dict_joint_sets.m (see toolbox-folder structure in Fig. 8).
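The row-per-DoF layout of these descriptors can be illustrated with a short sketch. Note that the toolbox itself is written in Matlab/Octave; the Python below is only a language-agnostic illustration of the data layout, with names mirroring the dictionary entries:

```python
# Joint descriptors as rows of [rx, ry, rz, tx, ty, tz] flags, following the
# spatial-vector convention described in the text. Each row is one DoF.
JOINT_SETS = {
    # 1 DoF: rotation about Y
    "Joint_RY": [[0, 1, 0, 0, 0, 0]],
    # 3 DoF planar floating base: translation along X and Z, rotation about Y
    "Joint_Root2D_TXTZRY": [
        [0, 0, 0, 1, 0, 0],
        [0, 0, 0, 0, 0, 1],
        [0, 1, 0, 0, 0, 0],
    ],
}

def dof_count(name):
    """The number of DoFs equals the number of rows in the descriptor."""
    return len(JOINT_SETS[name])
```

A model-creation script can then derive the joint dimensionality directly from the descriptor, e.g. `dof_count("Joint_Root2D_TXTZRY")` yields 3.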


Points on Segments

Pre-defined descriptors allow a user to set up typically used points for each segment. Here, segments are rigid bodies, each associated with a local coordinate frame (the description of body segments is detailed later in Sec. 2.3). For example, the descriptor “Points_Hand_R_3D” defines the following points on the right hand segment:


where the vector corresponding to each point defines the position of that point in the local segment coordinate frame, when scaled by the segment length. For the foot segment, we allow further detail to be provided that takes subject anthropometry into account (e.g. recorded values for heel-ankle offset, foot width, etc. from the provided anthropometry). Implementation details of the foot points are available in the toolbox documentation. Commonly used points are predefined in the dictionary file customSetups/dict/dict_point_sets.m.
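The scaling of normalized point coordinates by segment length can be sketched as follows (Python for illustration; the point names and values here are hypothetical, not the toolbox's actual “Points_Hand_R_3D” entries):

```python
# A point set stores positions normalized by segment length; model creation
# multiplies by the actual segment length to obtain local-frame coordinates.
def scale_points(point_set, segment_length):
    return {name: [c * segment_length for c in pos]
            for name, pos in point_set.items()}

points_hand_r = {                  # normalized local positions (illustrative)
    "ProximalHand_R": [0.0, 0.0, 0.0],
    "DistalHand_R":   [0.0, 0.0, -1.0],
}
local = scale_points(points_hand_r, 0.18)   # assuming a 0.18 m hand segment
```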

Figure 3: Exoskeletons scaled to the human model may be created using the corresponding functions (samples under customSetups/setups/exo). Shown is the automatic scaling of a 3D exoskeleton to two human models with varying segment proportions.

Point Constraints

Points on a body (human or object) can be constrained to the environment by defining constraint-sets that specify the point name, and the normal along which the constraint should act in base coordinates. For example, in order to constrain a sagittal plane foot segment containing the points “Toe_Sagittal” and “Heel_Sagittal”, we may use the descriptor “ConstraintSet_Foot_Sagittal” which defines the following constraints:


where the subset “FootFlat_Sagittal” constrains the “Heel_Sagittal” point along the X and Z normal directions, as well as the “Toe_Sagittal” point along the Z normal direction. This constraint subset may be used to rigidly fix the foot to the ground at these points. Similarly, rotation of the foot about the “Heel_Sagittal” point may be expressed by the “HeelFixed_Sagittal” constraint. In this manner, the constraint-set “ConstraintSet_Foot_Sagittal” may be used in a multi-body simulation to describe the behavior of the foot during walking as a series of heel and toe contacts (e.g. [5]). Commonly used point constraints are predefined in the dictionary file customSetups/dict/dict_constraint_sets.m.
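The structure of such a constraint-set, with each subset listing (point, normal) pairs in base coordinates, can be sketched as follows (a Python illustration of the data layout, not the toolbox's internal representation):

```python
# Constraint subsets for the sagittal foot, as described in the text:
# FootFlat fixes heel (X and Z) and toe (Z); HeelFixed allows rotation
# about the heel by constraining only the heel point.
X = (1.0, 0.0, 0.0)
Z = (0.0, 0.0, 1.0)
CONSTRAINT_SETS = {
    "ConstraintSet_Foot_Sagittal": {
        "FootFlat_Sagittal": [("Heel_Sagittal", X), ("Heel_Sagittal", Z),
                              ("Toe_Sagittal", Z)],
        "HeelFixed_Sagittal": [("Heel_Sagittal", X), ("Heel_Sagittal", Z)],
    },
}
```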

Figure 4: Custom scaling may be used to change the default proportions of segment lengths [Left], or to create subject-specific and asymmetric models [Right]. Also shown is a custom marker setup (blue circles).

2.2 Model Types

Two types of models may be defined: Human and Object. Each of these models may consist of a user-defined number of segments, and each segment derives its common base properties from the class “class_modelSegment” (defined in core/classes/class_modelSegment.m). The properties of this class include:

  • segment name, segment parent name, segment parent ID

  • segment mass, com and inertia properties

  • segment mesh details for visualization

  • marker names and positions

  • joint type, joint center and axis (relative to parent)

  • segment length (used for scaling purposes in human model)

  • associated points

  • associated constraint sets

2.2.1 Human model

We define scaling algorithms that provide methods to compute the segment kinematics and inertial properties from a person’s anthropometric details. Examples of scaling algorithms are the linear regression equations from de Leva [14], Dempster [15], Jensen [16] and Zatsiorsky [17]. Currently, the scaling algorithms from de Leva and Jensen have been made available as toolbox options. The de Leva scaling provides nominal values for segment lengths, mass and inertia properties based on subject gender, height and weight. We provide 3 de Leva datasets, the first two of which treat the torso as either one segment or subdivided into 3 segments (pelvis, mid-trunk, upper-trunk), Fig. 2. A third de Leva-derived dataset can be used to define models in the sagittal plane, where the inertial properties of the left and right limbs are combined. The scaling equations by Jensen are better suited for modeling the body segment properties of children. Jensen’s equations take into account the child’s age and weight, but require the segment lengths to be provided by the user. Additional scaling algorithms can be defined by the user using the templates provided with the toolbox. Note that all user-defined functions should be made available to the toolbox in the customSetups folder.
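The general form of such a linear scaling rule can be illustrated with a short sketch. The fractions below are made up for illustration and are NOT de Leva's or Jensen's published coefficients; the toolbox itself implements these rules in Matlab/Octave:

```python
# Illustrative linear scaling rule: nominal segment length and mass as fixed
# fractions of subject height and weight. Fractions are hypothetical.
FRACTIONS = {            # segment: (length/height, mass/weight)
    "Thigh": (0.245, 0.142),
    "Shank": (0.246, 0.043),
}

def scale_segment(segment, height_m, weight_kg):
    l_frac, m_frac = FRACTIONS[segment]
    return {"length": l_frac * height_m, "mass": m_frac * weight_kg}
```

A user-defined scaling algorithm would replace these fractions with regression coefficients appropriate to the population being modeled.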

In order to use the provided scaling algorithms, the user needs to provide some anthropometric details of the person being modeled. General details such as age, height, weight and gender are then used to compute the nominal proportions of the human body segments in the sagittal plane. Further information about the human, such as pelvis width, distance between hip centers, distance between shoulder centers, and foot offsets may be used to set up specialized joint centers for the shoulders, hips and feet. Anthropometric details are specified in a file with keywords as described below:


[Listing omitted: Subject_Anthropometry — details of the subject anthropometry file]

Note that some of the fields in the anthropometry file may not be required for certain model types. For example, if a custom scaling algorithm is defined that can compute hip and shoulder widths, then these may be omitted from the anthropometry file. Similarly, all values pertinent to the transverse plane are unnecessary if one is defining a planar model in the sagittal plane. It is left up to the user to choose the right combination of anthropometric details and scaling algorithm, with the toolbox providing corresponding checks and error messages in case of missing information.
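The keyword-file parsing and missing-field checks can be sketched as follows (Python for illustration; the keyword names and file syntax here are hypothetical stand-ins for the toolbox's actual format):

```python
# Parse a keyword-value anthropometry file and report missing mandatory
# fields, analogous to the checks performed by the toolbox.
def parse_anthropometry(text, mandatory=("age", "height", "weight", "gender")):
    values = {}
    for line in text.splitlines():
        line = line.split("%")[0].strip()       # '%' starts a comment
        if not line:
            continue
        key, _, val = line.partition(":")
        values[key.strip()] = val.strip()
    missing = [k for k in mandatory if k not in values]
    if missing:
        raise ValueError("missing anthropometry fields: " + ", ".join(missing))
    return values
```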

2.2.2 Object model

Object models can be used to define any number of bodies such as boxes, orthoses, or exoskeletons that may be used in conjunction with the human model. Each object is associated with a setup function that defines the configuration of the individual segments that make up the object. This setup function is analogous to the scaling algorithm used to specify the details of the human body segments. Segment inertial properties for objects may be defined in several ways. First, they may be directly included in the setup function. Second, they may be computed automatically using a mean density per segment and the segment mesh volume. Third, they may be provided directly by the user while defining a specific object. This flexibility in computing inertial properties allows the user to define a wide range of objects and properties.

Inertial properties based on mean-density or user-values can be specified in a file as follows:

Segment1, UseMeanDensity, density, ,,, ,,,
Segment2, UseUserValues, mass, CoM, inertia

If the keyword “UseMeanDensity” is specified, the mean density value (in kg/m³) and the mesh volume of the segment are used to compute the mass, center of mass and inertia. If “UseUserValues” is specified, then the values entered in the file are used, where mass is a positive scalar in kg, CoM is a 3-dimensional vector specifying the center of mass of the segment in local coordinates, and inertia is a 9-dimensional vector that specifies each row of the inertia matrix. Examples of object mass properties files are available in data/samples/3DHumanExoBox/ModelFiles….
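The “UseMeanDensity” computation can be sketched for a simple case (Python for illustration; here the segment is approximated as a solid box centered on its local origin, which is an assumption of this sketch, not the toolbox's mesh-based computation):

```python
# Mass, CoM and inertia of a box-shaped segment from a mean density and the
# box dimensions (volume = dx*dy*dz). Inertia uses the solid-box formula
# about the center of mass.
def box_segment_inertia(density, dx, dy, dz):
    volume = dx * dy * dz
    mass = density * volume
    com = (0.0, 0.0, 0.0)                  # box centered on local origin
    ixx = mass / 12.0 * (dy**2 + dz**2)
    iyy = mass / 12.0 * (dx**2 + dz**2)
    izz = mass / 12.0 * (dx**2 + dy**2)
    return mass, com, (ixx, iyy, izz)
```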

Some examples of object setups are included that specify exoskeletons and box-type weights: for example, setups to create exoskeleton models that correspond to a sagittal plane human model, as well as to a 3D human model (Fig. 3). Both exoskeleton setups automatically scale the exoskeleton to the human model. Sample setups for creating box objects are provided in the toolbox under customSetups/setups/box (sample object shown in Fig. 5).

Note that currently we only model the kinematics and inertial properties of the objects, and not more detailed aspects such as spring-loaded joints and actuator characteristics (e.g. for exoskeletons). These model enhancements will be the focus of future developments of the ModelFactory toolbox.

Figure 5: ModelFactory Graphical User Interface under Matlab R2017a - Outlined are the parts related to the environment variables, visualization and model export. The values displayed in the human-related boxes may still be modified before model creation.

2.3 Model Description

Both human and object models are described in a file that details the kinematic chain, the number and type of segments, joint types etc. Other than the segment names, all descriptors must correspond to elements in the dictionary, the scaling algorithm (in case of the human model) or the object setup (in case of the object model). An example of the description file for a human model is shown below.


[Listing omitted: Human_Description (first 4 lines) — excerpt of a model description file]

Each line has the following fixed structure:

  1. Name of the segment (may be freely chosen)

  2. Segment type as it is defined in the scaling algorithms or the setup functions for objects

  3. Joint type that is connected to the segment. ’R’ denotes a rotational joint and ’T’ a translational joint. ’X’, ’Y’ and ’Z’ are the frontal, transverse and longitudinal axes, respectively

  4. Name of the parent segment. The predefined name for the origin is ROOT

  5. Set of points (optional). The points have to be defined in the dictionary “dict_point_sets.m” or made available as custom point definitions

  6. Set of constraints (optional). The constraints have to be included in the dictionary “dict_constraint_sets.m” or made available as custom constraint definitions
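A minimal reader for this fixed line structure can be sketched as follows (Python for illustration; the toolbox itself is Matlab/Octave, and the segment names and joint types in the sample lines are hypothetical):

```python
# Parse description lines (name, type, joint, parent, optional points and
# constraints) and resolve parent indices; ROOT denotes the origin.
def build_chain(lines):
    segments, index = [], {"ROOT": -1}
    for line in lines:
        fields = [f.strip() for f in line.split(",")]
        name, seg_type, joint, parent = fields[:4]
        segments.append({
            "name": name, "type": seg_type, "joint": joint,
            "parent": parent, "parent_id": index[parent],
            "points": fields[4] if len(fields) > 4 else None,
            "constraints": fields[5] if len(fields) > 5 else None,
        })
        index[name] = len(segments) - 1
    return segments

chain = build_chain([
    "Pelvis, Pelvis, Joint_Root2D_TXTZRY, ROOT",
    "Thigh_R, Thigh, Joint_RY, Pelvis",
])
```

Resolving parents to indices in this way gives the kinematic chain that the segment data is later attached to.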

Figure 6: Human body meshes as simple geometric elements [Left] or more detailed meshes derived from the MakeHuman [18] software [Right]


2.4 Model Customization

We allow for the customization of model proportions to better match specific individuals. In addition, we allow for definition of custom marker setups and dictionary descriptors.

2.4.1 Custom scaling of human model segments

Human model segments may be scaled to subject-specific values by providing a list of segment lengths and the corresponding segment names. These custom segment lengths may, for example, be measured in an experimental setting. Additional customization values in the transverse plane (e.g. hip width and shoulder width) can be provided as part of the subject anthropometry. ModelCreator updates the model kinematics to reflect the provided custom scaling. In addition, the segment mass distributions are proportionally adjusted to match the relative segment lengths. This is done such that the adjusted segment masses sum up to the total body mass provided in the subject anthropometry, as follows:

m̂_i = m_i (l̂_i / l_i) (m_total / m̃),  i = 1, …, n

where n is the number of segments, m̂_i and m_i denote the custom and default segment masses, and l̂_i and l_i are the corresponding custom and default segment lengths. m_total denotes the total mass of the human, and the normalization term m̃ is computed as:

m̃ = Σ_{j=1}^{n} m_j (l̂_j / l_j)
Figure 4 shows some examples of customized models with different segment lengths and proportions. Segment lengths are specified as a formatted text file containing the segment length and name, as shown below:


[Listing omitted: Human_SegmentLengths (first 4 lines) — excerpt from the human custom segment lengths file]
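The mass adjustment described in this section, i.e. scaling segment masses in proportion to the length change and then normalizing so they sum to the subject's total mass, can be sketched numerically (Python for illustration; the segment values are made up):

```python
# Scale each default mass by its length ratio, then normalize the result so
# the adjusted masses sum to the total body mass from the anthropometry.
def adjust_masses(default_masses, default_lengths, custom_lengths, total_mass):
    scaled = [m * lc / ld for m, ld, lc
              in zip(default_masses, default_lengths, custom_lengths)]
    norm = total_mass / sum(scaled)
    return [m * norm for m in scaled]

# Two hypothetical segments; only the first changes length (0.4 m -> 0.5 m).
masses = adjust_masses([4.0, 6.0], [0.4, 0.6], [0.5, 0.6], 10.0)
```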

2.4.2 Motion-capture markersets

Motion-capture markers may be used to reconstruct the model motion from recorded data with methods such as inverse kinematics based on least-squares optimization (e.g. [3]). Marker definitions corresponding to the VICON PIG markerset [19] may be included in the human model (Fig. 2). This option is controlled via the “AddMarkers” field while setting up the model (detailed later in Sec. 3.1). Alternatively, customized configurations of markers may be used by defining the marker placement on each of the human and object segments. This definition is provided as a formatted text file, shown in the example below:

Segment_Pelvis, Cluster, 0.043,
Pelvis_1, Pelvis_2, Pelvis_3, , , , -1.0, -0.05, 0.90, 0, 20, 0

The first line specifies the name of the segment the marker is attached to, the marker type, and the distance between markers (for Clusters). The second line specifies the marker names, followed by the translational offset and the rotational offset.

The translational and rotational offsets are specified with respect to the origin of the body the marker is attached to. This example sets up a customized flange-type marker cluster and is included in the toolbox under data/samples/ModelFiles_3DHumanCustom/…. There are several marker types available: Marker (a single point), Cluster (consisting of three markers) and DoubleCluster (consisting of 2 parallel clusters) (refer to Fig. 4). For further details on specifying custom markers, see the toolbox documentation.
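Placing a cluster from such an offset specification can be sketched as follows. This Python illustration assumes the three cluster markers lie along the local X-axis spaced `d` apart and only applies the rotation about Y (matching the “0, 20, 0” degree example above); the toolbox's actual cluster geometry may differ:

```python
import math

# Place three cluster markers: lay them out locally, rotate about Y by
# ry_deg degrees, then translate by the specified offset.
def cluster_positions(offset, ry_deg, d):
    c = math.cos(math.radians(ry_deg))
    s = math.sin(math.radians(ry_deg))
    ox, oy, oz = offset
    markers = []
    for i in (-1, 0, 1):                 # local layout along local X
        lx, lz = i * d, 0.0
        gx = c * lx + s * lz + ox        # rotation about Y, then translation
        gz = -s * lx + c * lz + oz
        markers.append((gx, oy, gz))
    return markers
```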

2.4.3 Custom dictionary terms

The dictionary terms used to create models may be expanded by adding customized definitions of joint types, points, and point constraints using the templates described in Eqs. (1), (2) and (3), respectively. User-defined dictionary terms should be made available to the toolbox by editing the file customSetups/dict/dict_definitions.m and adding the location of the file containing the custom terminology.

Custom points may be used to include points on segments that are of interest for specific applications (e.g. bony landmarks or mesh corners for collision checking). Custom joints provide a way for users to use their own terminology for defining joints (i.e. different from the spatial vector formulation [2] used as default). The implementation and use of custom joints can be influenced by using the boolean “customJoint” in the corresponding dictionary descriptor. With this functionality, users can, for example, define complex joint movements such as the surface-geometry based scapulo-thoracic joint [20]. Note that custom joints may require additional model export functions.

It may also be useful to define custom constraints for use during the computation of multi-body dynamics. Custom constraints may be used by adding the relevant dictionary terms, and defining how they are used in the models and written to the model file. As an example, custom loop constraints have been included in the toolbox. In contrast to the body-to-environment constraint-sets (Sec. 2.1), loop constraints may be used to constrain chosen degrees of freedom between two points on two bodies (e.g. human to object or object to object). For usage of point and loop constraints, refer to the publications [5, 6, 7] and the multi-body dynamics library [1].

Figure 7: General procedure for creating human models. Mandatory fields are the anthropometry, model description and choice of scaling algorithm. Object model creation follows a similar procedure.

3 Results and discussion

The ModelFactory toolbox may be accessed via a graphical user interface (GUI) or via a text interface. The GUI is available for usage with Matlab® via the script ModelCreator_GUI.m (Fig. 5), and the non-GUI version may be used in Matlab and Octave via the script ModelCreator_noGUI.m. For both GUI and non-GUI versions, the same basic set of files is required to set up, create and export models (as described in Sec. 2). All the model description files and various options are listed in a single environment file, as detailed in the following.

3.1 Environment file

The environment file consists of sets of keywords followed by string values associated with the keywords. Each environment file is associated with only one human model and any number of object models. Excerpts from an environment file are provided below followed by details about each of the keywords. We start with keywords related to the human model:


[Listing omitted: EnvironmentSetup.env (lines 1–23) — environment details: human model]

Most of the keywords are self explanatory and refer to the human model aspects detailed in Sec. 2. With the field “humanModel_TypeMeshes” one can choose the default geometric shapes or more detailed human-like meshes (Fig. 6) derived from the software MakeHuman [18] to visualize the human model. The field “AddMarkers” adds the default markerset to the human model. The object model keywords are as follows:


[Listing omitted: EnvironmentSetup.env (lines 25–32) — environment details: object model 1]

Consecutive objects can be defined by numbering the related object files. Additionally, the following general environment keywords may be defined:


[Listing omitted: EnvironmentSetup.env (lines 33–38) — environment details: general keywords]

where a custom markerset may be used by providing the path to the marker definition file with the field “UseCustomMarkers”. The keyword “combinedModel_Save” collates all the models (human + all objects) into one Lua file.

3.2 Model creation and export

Environment files are loaded by ModelCreator and the options are processed to create the corresponding models. The human and object models are created sequentially and separately. Note that the definition and creation of objects is optional, and the only mandatory fields to create the human model are the anthropometric information (Sec. 2.2.1), human model description file (Sec. 2.3) and the choice of scaling algorithm (Sec. 2.2.1).

The general procedure for human model creation is shown in Fig. 7. Objects are created similarly but with fewer available functionalities, as detailed in Tab. 1. First, we build the kinematic chain of the model based on the description file. The individual segments are then populated with data from a scaling algorithm (or object setup): the transformations between parent and child bodies, the joint DoFs connecting the parent and child, the points (if any) and the point constraints (if any). Finally, if defined, the positions of the markers are computed and added to the model.

After successful model creation, the results may be exported into individual Lua files or combined together in one Lua file. These options are controlled via the buttons provided in the GUI (Fig. 5) or via the “humanModel_Save” and “combinedModel_Save” keywords in the environment file.

3.3 Use case

The toolbox includes a wide range of sample environment and model files under the folder data/samples, as well as additional documentation (see toolbox-folder structure in Fig. 8). We also include an example where a human model and experimentally recorded walking data are used to compute joint angles and joint torques using inverse kinematics and inverse dynamics. Inverse kinematics is solved as a least-squares optimization problem using the open-source software Puppeteer [3]. Inverse dynamics is computed using the open-source rigid-body dynamics library RBDL [1]. Both Puppeteer and RBDL are compatible with the Lua model export format currently offered by the ModelFactory toolbox. Additional code is included for reading the model fields such as points and constraints that were described in previous sections. This use-case example is available in the folder data/samples/use-case-walking.

3.4 Future Developments

Further developments of the toolbox are planned in several directions. First, it would be interesting to include other model export formats (e.g. URDF) to allow the models to be more widely used. The toolbox functionality could also be extended to include model fields related to actuation. For humans, this could mean joint actuators such as muscle torque generators [6] and line-type muscle models [21]. In this context, models of objects with actuators (e.g. active exoskeletons, prostheses) could also be interesting. This would open up the possibility of creating robot models alongside their motor characteristics, sensors, etc. The model applications considered here are limited to the kinematics and dynamics of rigid-body systems. In general, it is of interest to incorporate deformable models and those with wobbling masses (e.g. [22]). Some of the development ideas mentioned here are the focus of ongoing work and will be published on the public repository as they become available.

4 Conclusions

The ModelFactory toolbox detailed here allows users to create a wide range of multi-body models in a quick and standardized manner. For applications requiring a number of subject-specific human models (e.g. [7]), or those where a model is varied in a systematic manner, such a model creation toolbox can save the user time and effort. Using the extensive examples as templates, users can also extend the toolbox to include model customizations specific to their application. Note that the ModelFactory toolbox only generates model files; these may then be used to conduct further analysis of human movements, for example with methods such as inverse kinematics and inverse dynamics (e.g. [4]). Alternatively, the models can be used to simulate movements with methods such as optimal control [5, 6]. The toolbox is published as open-source software and is compatible with other open-source software (e.g. RBDL [1] and Puppeteer [3]) that can be used with the generated model files.

5 Availability and requirements

Project name: ModelFactory

Operating system(s): Platform independent

Programming language: Matlab / Octave

Other requirements: none

Any restrictions to use by non-academics: none

6 List of abbreviations

GUI - Graphical User Interface; RBDL - Rigid Body Dynamics Library; URDF - Unified Robot Description Format; XML - Extensible Markup Language; DoF - Degree of Freedom

Functionality              Human   Object
Anthropometry                X
Model description            X       X
Scaling algorithms           X
Custom scaling               X
Joint types                  X       X
Points                       X       X
Point constraints            X       X
Custom markers               X       X
Custom setups                        X
Segment mass from mesh               X
Segment mass from user               X

Table 1: Overview of the functionalities currently provided in the toolbox for the two model types. ’X’ marks the availability of the functionality for that model type.
Figure 8: Overview of the folder structure of the ModelFactory toolbox and location of important files. Most user-related files are present in the customSetups folder. For example, new dictionary definitions should be added to the dict_definition.m script, and new object setup functions should be added to the customSetups/setups folder. Several sample environment files and a use-case are available in the data/samples folder. The meshes used to visualize the model are available in the data/meshes folder.


Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Availability of data and material

The data relevant to this work are included in this article’s supplementary information files. Future developments are available on the ModelFactory repository.

Competing interests

The authors declare that they have no competing interests.


Financial support by the European Commission within the H2020 project SPEXOR (GA 687662) is gratefully acknowledged. We acknowledge financial support by Deutsche Forschungsgemeinschaft and Ruprecht-Karls-Universität Heidelberg within the funding programme Open Access Publishing.

Authors’ contributions

MS developed the software architecture with assistance from MH. Both authors read and approved the final manuscript.


We thank Matthew Millard, Kevin Stein and Katja Mombaur for assistance with software development, contributions to the code and helpful discussions.


  • [1] Felis, M.L.: RBDL: an efficient rigid-body dynamics library using recursive algorithms. Autonomous Robots, 1–17 (2016)
  • [2] Featherstone, R.: Rigid Body Dynamics Algorithms. Springer, New York (2008)
  • [3] Felis, M.L., Mombaur, K., Berthoz, A.: An optimal control approach to reconstruct human gait dynamics from kinematic data. In: IEEE-RAS 15th International Conference on Humanoid Robots (Humanoids), pp. 1044–1051 (2015)
  • [4] Sreenivasa, M., Chamorro, C.J.G., Gonzalez-Alvarado, D., Rettig, O., Wolf, S.I.: Patient-specific bone geometry and segment inertia from MRI images for model-based analysis of pathological gait. Journal of Biomechanics 49(9), 1918–1925 (2016)
  • [5] Sreenivasa, M., Millard, M., Felis, M., Mombaur, K., Wolf, S.I.: Optimal control based stiffness identification of an ankle-foot orthosis using a predictive walking model. Frontiers in Computational Neuroscience 11(23) (2017)
  • [6] Millard, M., Sreenivasa, M., Mombaur, K.: Predicting the motions and forces of wearable robotic systems using optimal control. Frontiers in Robotics and AI 4(41) (2017)
  • [7] Harant, M., Sreenivasa, M., Millard, M., Sarabon, N., Mombaur, K.: Parameter Optimization for Passive Spinal Exoskeletons Based on Experimental Data and Optimal Control. In: Proceedings of the 2017 IEEE-RAS International Conference on Humanoid Robots (2017)
  • [8] Geyer, H., Herr, H.: A muscle-reflex model that encodes principles of legged mechanics produces human walking dynamics and muscle activities. IEEE Transactions on Neural Systems and Rehabilitation Engineering 18(3), 263–273 (2010)
  • [9] Sreenivasa, M., Soueres, P., Nakamura, Y.: On Using Methods from Robotics to Study Human Task-dependent Balance During Whole-body Pointing and Drawing Movements. In: Proceedings of the 4th IEEE RAS & EMBS International Conference on Biomedical Robotics and Biomechatronics, pp. 1353–1358 (2012)
  • [10] Sherman, M.A., Seth, A., Delp, S.L.: Simbody: multibody dynamics for biomedical research. Procedia IUTAM 2(Supplement C), 241–261 (2011). IUTAM Symposium on Human Body Dynamics
  • [11] Vahrenkamp, N., Kröhnert, M., Ulbrich, S., Asfour, T., Metta, G., Dillmann, R., Sandini, G.: Simox: A robotics toolbox for simulation, motion and grasp planning. In: International Conference on Intelligent Autonomous Systems (IAS), pp. 585–594 (2012)
  • [12] Wieber, P.-B., Billet, F., Boissieux, L., Pissard-Gibollet, R.: The HuMAnS toolbox, a homogenous framework for motion capture, analysis and simulation. In: International Symposium on the 3D Analysis of Human Movement, Valenciennes, France (2006)
  • [13] Kleesattel, A., Clever, D., Funken, J., Potthast, W., Mombaur, K.: Modeling and Optimal Control of Able-bodied and Unilateral Amputee Running, pp. 164–167 (2017)
  • [14] de Leva, P.: Adjustments to Zatsiorsky-Seluyanov’s segment inertia parameters. Journal of Biomechanics 29(9), 1223–1230 (1996)
  • [15] Dempster, W.T.: The anthropometry of body action. Annals of the New York Academy of Sciences 63(4), 559–585 (1955)
  • [16] Jensen, R.K.: Body segment mass, radius, and radius of gyration proportions of children. Journal of Biomechanics 19(5), 359–368 (1986)
  • [17] Zatsiorsky, V.: The mass and inertia characteristics of the main segments of the human body. Biomechanics, 1152–1159 (1983)
  • [18] MakeHuman: Open Source tool for making 3D characters.
  • [19] Vicon: Lower body modeling with Plug-in Gait.
  • [20] Seth, A., Matias, R., Veloso, A.P., Delp, S.L.: A biomechanical model of the scapulothoracic joint to accurately capture scapular kinematics during shoulder movements. PLOS ONE 11(1), 1–18 (2016)
  • [21] Delp, S.L., Anderson, F., Arnold, A., Loan, P., Habib, A., John, C., Guendelman, E., Thelen, D.G.: OpenSim: open-source software to create and analyze dynamic simulations of movement. IEEE Transactions on Biomedical Engineering 54, 1940–1950 (2007)
  • [22] Gruber, K., Ruder, H., Denoth, J., Schneider, K.: A comparative study of impact dynamics: wobbling mass model versus rigid body models. Journal of Biomechanics 31(5), 439–444 (1998)