Outline:

Module 1: Accelerating Materials Development and Deployment

Module 2: Materials Knowledge and Materials Data Science

Module 3: Materials Innovation Cyberinfrastructure and Integrated Workflows

Module 4: Materials Knowledge Improvement Cycles

Module 5: Case Study in Homogenization:

Plastic Properties of Two-Phase Composites

Module 6: Process-Structure Evolution Linkages

Module 1: Accelerating Materials Development and Deployment

1.1 importance-of-accelerated-and-integrated-materials-and-product-design

To summarize what we've learned in this first lecture: the 20th century's grand challenges and advances involved materials, and the 21st century's grand challenges will involve materials as well; this is a very safe bet. Modeling, simulation, and computation have exploded over the last 15 to 20 years, which leads to the idea that we can reduce the time from the discovery of a material to the deployment of a product by exploiting data sciences and informatics, coupled with modeling, simulation, and experiments. Revolutions in modeling, big data, computational methods, the Internet, and the Internet of Things usher in a new era of possibility for accelerating materials discovery and development.

1.2 historical-materials-development-and-deployment-cycles

Typically it has taken about 20 years for materials discovered in university or industrial research laboratories to find their way into products.

Why does it take so long? After discovering a material, one has to optimize its properties, but even then it cannot find its way into a product yet. It has to undergo a rigorous, stringent certification process; it has to meet many requirements; it has to be cost effective; and so forth. So, largely, we have relied on empirical routes.

It's interesting that this cycle time from the discovery of a material to its deployment hasn't changed much over this period. For example, consider Charles Goodyear's work on vulcanized rubber in the mid 1800s. It was an important discovery that enabled modern automotive tires: crosslinking of the rubber molecules has allowed rubber to withstand heat and cold, and it has given us huge consumer benefits in a wide range of applications. It was discovered serendipitously, in other words empirically. The question is: can we use modeling and simulation, together with the elements of data science (data mining, machine learning), to accelerate this process?

In summary, the historical trend of about 20 to 25 years to translate a materials discovery in the laboratory into a product has been hard to break out of, and materials discovery by chance or empiricism has been the traditional route. But we are poised to break out of it, with the intersection of modeling and simulation, advanced experimental techniques, and their coupling with data sciences and informatics. This represents an important shift towards the integration of materials and product design, enabling us to translate our needs into products more rapidly and in a customized way.

1.3 how-do-we-accelerate-materials-development-and-deployment

We've talked about top-down design having significant challenges, and a dose of reality is essential to understanding how we can move forward in advancing methods for materials design and development. Most importantly for our purposes here, the data sciences can play a key role in reducing the number of experiments and the degree of empiricism that has existed historically in this process, and in providing better support for decision making in the design and development of new and improved materials.

1.4 emergence-of-multi-stakeholder-initiatives

One could accelerate the design and development of materials, which had previously taken 20 to 30 years, down to a time frame on the order of three to five years. This important demonstration then served as a basis for industry to support the Integrated Computational Materials Engineering (ICME) initiative, in which engineering and the modeling and simulation of materials are brought together by integrating computational tools with decision processes in materials development, using various kinds of methodologies.

This entire methodology, or philosophy, of accelerating the insertion of materials relies on two driving forces. First, consumer demand for improved materials with enhanced functionality, lighter weight, and lower cost is ever increasing. Second, the time to introduce new materials into the marketplace has traditionally been on the order of 15 to 20 years, while the time frame for design and prototyping of most consumer products is now down to something on the order of 18 to 36 months. That disconnect between the time scales of developing materials and designing new products is part of what we're trying to eliminate with the Materials Genome Initiative.

In summary, we've discussed how the accelerated materials discovery and development enterprise has developed rapidly, particularly in the last 15 years, including major initiatives in the US such as Integrated Computational Materials Engineering and the Materials Genome Initiative. More broadly, these initiatives have been adopted abroad, in Asia and in Europe, so it is an international enterprise. The Materials Genome Initiative has articulated a need to link experiments, computational modeling and simulation, and data sciences together as a major 21st-century initiative to accelerate the development and insertion of materials. The stakeholders of accelerated materials discovery and development span the entire economy, ranging from academia and government to industry and consumers.

1.5 the-materials-innovation-ecosystem

The Materials Innovation Ecosystem is characterized, at its center, by a focus on the integration of computation, experiment, and data sciences, all wrapped together and synchronized via high-throughput methodologies.

e-collaboration is more than just email messaging; it involves the enhancement of information and data content, and it is very important for the future of materials design and development.

There are some important caveats about this materials innovation ecosystem, and distinctions between what we mean by it and previous, fairly recent initiatives:

  • Yes, this is a big data problem, but it's not just high performance computing; it's not just problems running on a large number of processors in parallel.
  • It's not just about boutique "designer" materials. It's also about improving standard materials that are used extensively in products, for example concrete and steel, which can be improved with these kinds of approaches.
  • It's not just about high-fidelity predictive modeling and simulation. It's also about making optimal use of meta-models or reduced-order models to provide decision support in the development process.
  • It's not just nanotechnology. To be certain, it has nanotechnology elements, associated with atomic-level predictive simulation of material structure and properties, but that's only part of the puzzle: real materials used in products at larger scales have many other elements associated with metastability, non-equilibrium, and processing that are not necessarily within the nanotechnology realm.
  • It does not just imply large-scale national user facilities for materials research and development, but rather ways to achieve distributed mass usage of already existing facilities, networked and linked in a more efficient and productive manner.
  • It's not just a materials information infrastructure or cyberinfrastructure. It's the intelligent deployment and development of new and improved methods by which people can collaborate to produce gains in decision support for materials development.

1.6 part1: multiscale-modeling-and-multilevel-design-of-materials-with-structure-hierarchy

Learning outcomes:

  • Materials structure hierarchy
  • Mesoscale gaps in multiscale modeling (a mesoscale gap is one in which we don't have good, highly reliable, low-uncertainty models. Often, because this domain is quite metastable in nature, meaning it can be strongly affected by temperature and environment, we have to rely on experimental protocols to understand the phenomena and to build simple reduced-order models to represent it)
  • Hierarchical vs concurrent multiscale modeling (both hierarchical and concurrent multiscale modeling can help us understand how material properties vary across scales; however, concurrent models are still applied less often, because they require simulation information to be exchanged interactively between multiple scales. Hierarchical multiscale modeling therefore remains the most widely used)
  • Distinction between multiscale and multilevel design
  • The role of uncertainty
  • Key enabling elements in multilevel design of materials

Why multiscale modelling?

  • Properties are scale specific. Modeling at selected scales of the hierarchy provides decision support for materials development.
  • Uncertainties of models at various scales, and of multiscale transitions, are prevalent.
  • Sensitivity of responses to microstructure must be understood in order to tailor materials to achieve the required performance.

1.6 part2: multiscale-modeling-and-multilevel-design-of-materials

Multiscale modeling is distinct from multilevel design, and serves the purpose of multilevel design. Uncertainty quantification and management are important in providing decision support.

1.7 decision-making-in-systems-based-robust-materials-design

Decision making under uncertainty

1.8 multilevel-decision-based-design

Module 2: Materials Knowledge and Materials Data Science

Keys for accelerated materials innovation:

  • Standardized representation of materials internal structure: objective, broadly applicable, and low-dimensional

  • High-throughput protocols for rapid exploration of vast spaces for materials design

  • Uncertainty quantification in all datasets (experimental and modelling)

  • Data-driven protocols leading to objective decision making in materials development efforts

  • e-collaboration platform (we need a lot of different expertise from different stakeholders)

  • Digital recording of workflows to document successes and failures, to establish best practices
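The "low-dimensional representation" key above is often realized in practice by applying principal component analysis (PCA) to microstructure statistics. A minimal numpy sketch, using synthetic data (the array names and sizes here are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic dataset: 50 material samples, each described by a
# flattened vector of 400 microstructure statistics.
stats = rng.random((50, 400))

# PCA via SVD of the mean-centered data: the first 3 principal
# component scores serve as a low-dimensional representation.
centered = stats - stats.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
scores = centered @ Vt[:3].T  # shape (50, 3)
```

Each sample is thus reduced from 400 numbers to 3, which makes it tractable to compare microstructures objectively and to build process-structure-property linkages on top of the reduced representation.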

terminology in materials informatics

Data: quantitative/qualitative, 2D or 3D ... hence data is broadly defined.

Data collection: experiments, models, simulations.

Data integrity: a measure of trust in the data (accuracy, completeness, etc.)

Data quality: a measure of how worthy or useful the data is for decision making

Data mining: a process of exploring large datasets and identifying patterns that can be used to predict outcomes on new data.

Metadata: loosely defined as data about data, i.e. the data that describes the actual content of a dataset.

Structured and unstructured data: their difference is the level of organization of data

Database: an organized collection of data
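The data-mining definition above (identifying patterns in a dataset and using them to predict outcomes on new data) can be illustrated with a toy numpy sketch; the data here is synthetic and the linear model is just one simple choice of pattern:

```python
import numpy as np

rng = np.random.default_rng(3)
# Dataset: inputs x and noisy outcomes following y = 2x + 1.
x = rng.random(200)
y = 2.0 * x + 1.0 + 0.01 * rng.standard_normal(200)

# Identify the pattern: least-squares fit of a linear model.
A = np.vstack([x, np.ones_like(x)]).T
slope, intercept = np.linalg.lstsq(A, y, rcond=None)[0]

# Use the learned pattern to predict outcomes on new data.
x_new = np.array([0.25, 0.75])
y_pred = slope * x_new + intercept
```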

main components of data science

In overview, the components of data science are: 1) data management, 2) data analytics, and 3) e-collaboration, through email/Skype/Google Hangouts/GitLab, to get and share data.

Some integrated e-collaboration tools: nanoHUB for nanomaterials and Galaxy for biology.

What is big data?

Big data definition/scope: large and complex data.

The five V's challenge in big data: 1) Volume 2) Velocity 3) Variety 4) Veracity (accuracy) 5) Value

  • Volume: the vast amount of data generated every second. For example, a multi-day beamline experiment can generate 20 terabytes of data, or even petabytes.

  • Velocity: the unprecedented speed at which data is generated

  • Variety: data comes from different sources

  • Veracity: refers to the trustworthiness of the data; we also call this 'data quality'.

  • Value: the useful insight that can be extracted from the data for decision making

Module 4: Materials Knowledge Improvement Cycles

digital-representation-of-material-structure

Understand these concepts: local state, local state space, microstructure function, and the probability density or the probabilities.

Note: 1) discrete outcomes allow us to define probabilities (e.g. tossing a coin); 2) continuous outcomes only allow us to define probability densities m(h, x, t); 3) experiments typically produce only discretized information, suitable for evaluating the probabilities m(h, x).
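For the discrete case, the probabilities m(h, x) can be sketched with a small numpy example (the ensemble here is synthetic, and the grid size and phase fraction are invented for illustration): for a two-phase material sampled on a grid, m(h, x) is estimated as the fraction of ensemble members in which local state h occurs at voxel x.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ensemble of 100 two-phase microstructures on a 16x16 grid.
# Each voxel holds a discrete local state h in {0, 1}.
ensemble = (rng.random((100, 16, 16)) < 0.3).astype(int)

n_states = 2
# m[h, x1, x2]: probability of finding local state h at voxel x,
# estimated as the ensemble average of the indicator function.
m = np.stack([(ensemble == h).mean(axis=0) for h in range(n_states)])

# The probabilities over local states sum to 1 at every voxel.
assert np.allclose(m.sum(axis=0), 1.0)
```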

spatial-correlations-n-point-statistics
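A common way to evaluate 2-point spatial correlations on a periodic grid is FFT-based autocorrelation of the phase indicator (microstructure) function. A minimal numpy sketch with a synthetic two-phase microstructure (array names and sizes are invented for illustration; periodic boundary conditions are assumed):

```python
import numpy as np

rng = np.random.default_rng(1)
# Single two-phase microstructure on a periodic 32x32 grid.
# m1[x] = 1 if voxel x belongs to phase 1, else 0.
m1 = (rng.random((32, 32)) < 0.5).astype(float)

# 2-point autocorrelation f2[r] = probability that both x and x + r
# lie in phase 1, computed for all vectors r at once via FFT.
F = np.fft.fft2(m1)
f2 = np.real(np.fft.ifft2(F * np.conj(F))) / m1.size

# At zero separation, f2 recovers the volume fraction of phase 1.
assert np.isclose(f2[0, 0], m1.mean())
```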