NASA and IBM are working together to create foundation models based on NASA’s data sets — including geospatial data — with the goal of accelerating the creation of AI models. Foundation models are trained on large, broad data sets and then used to train other AI models with smaller, targeted data sets.

One real-world example of a foundation model at work is ChatGPT, which was built on the foundation model GPT-3.
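To make that pattern concrete, here is a minimal sketch of how a broadly pretrained model gets adapted to a narrower task with a small, targeted data set. It uses the Hugging Face transformers and datasets libraries, with a generic BERT checkpoint and a slice of the IMDB review data set as stand-ins; the model name, data set, and hyperparameters are illustrative assumptions, not the actual NASA/IBM setup.

```python
# Minimal sketch: adapt a pretrained ("foundation") model to a narrower task
# using a small, targeted dataset. The checkpoint, dataset, and hyperparameters
# below are illustrative stand-ins, not NASA/IBM's actual configuration.
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# 1. Start from a broadly pretrained checkpoint (the "foundation" part).
checkpoint = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

# 2. Bring in a small, task-specific dataset (here: 2,000 IMDB reviews).
raw = load_dataset("imdb", split="train[:2000]")
tokenized = raw.map(
    lambda batch: tokenizer(batch["text"], truncation=True, padding="max_length"),
    batched=True,
)

# 3. Fine-tune briefly on the targeted data instead of training from scratch.
args = TrainingArguments(
    output_dir="finetuned-model",
    num_train_epochs=1,
    per_device_train_batch_size=8,
)
trainer = Trainer(model=model, args=args, train_dataset=tokenized)
trainer.train()
```

The same workflow applies whether the small data set is movie reviews or labeled satellite imagery: the expensive, broad pretraining happens once, and each downstream model only pays for a short fine-tuning step.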

Two AI Model Goals for NASA and IBM

Foundation models (https://research.ibm.com/blog/what-are-foundation-models) can be powerful: It originally took IBM seven years to train Watson in 12 languages. By using a foundation model, IBM accelerated Watson’s language abilities to 25 languages in approximately one year.
