Business analytics and wider research have increasingly been moving to cloud-based services and providers. For geospatial analysis the transition has been comparatively slow, but the pace now appears to be increasing. Companies are beginning to develop platforms that bring geospatial analytics to the cloud, while other options developed by analysts are also available.
Challenges are increasingly seen at large spatial scales, where researchers conduct analysis at global extents while also requiring fine-grained analytics and improved visualization across multiple temporal and spatial scales. Growing access to multiple data platforms, from satellites to various land-based sensors, means that researchers can conduct assessments at both large and small spatial scales, but a long-standing limitation has been the lack of easy access to computing resources that combine high throughput, data storage, visualization, and analytics. Descartes Labs, based in New Mexico, has created a cloud-based geospatial analytics platform, focused on enterprise and real-time data cataloging and modeling, to facilitate cloud-based geospatial analysis for its users. The intent is to ease data access and sharing so that models and analyses can be deployed rapidly, something that would otherwise be difficult given the increasingly large volume of data available for analytics. The platform combines petabyte-scale geospatial data, a workbench of visualization and modeling tools, including machine learning methods, and an application builder for customizing user-driven analyses and experiences, enabling new forms of assessment.
The move to cloud-based solutions has been developing for some time, and spatial analysis had already been making use of high-performance computing (HPC). For example, the Virtual Fire platform was developed on the premise that managers and analysts assessing fire probability in a given area need model output that is simple to assemble and computational capability that is easy to obtain. Virtual Fire achieves this by moving analysis to an HPC system; its capabilities could be improved further by moving data to cloud-based platforms that handle storage and transfer for the analyses used to forecast fire movement and impact.
A recent study of landscape ecology and landscape change driven by human-environment interactions used a cloud-based approach for data storage and analytics, producing detailed visualizations at fine spatial scales (down to a few meters) that can also be scaled up to country-level views, such as the entire United States. This was done using Amazon's EC2 cloud resource, with analysis, visualization, and data storage combined in one system rather than split across separate applications as had been done previously. High rates of data transfer, however, create problems of throughput and connectivity between the cloud and the user. Newer approaches to cloud-based geospatial computing include fog computing, which attempts to increase throughput and reduce latency for clients accessing cloud-based data and analytics. In effect, this minimizes the slow data transmission that can arise from poor client connectivity or limited bandwidth. It is done by optimizing what data are sent to the cloud, in part using machine learning to determine which data packets are needed, and reducing transmission to the most useful parts.
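The edge-side data reduction just described can be sketched in a few lines. This is a minimal illustration under assumed names and thresholds, not any cited platform's actual implementation: a hypothetical fog node summarizes raster tiles locally with a cheap statistic and forwards only those tiles whose values depart meaningfully from a baseline, so most of the data never crosses the network.

```python
# Illustrative sketch of fog-style edge reduction; tile IDs, the mean-based
# summary, and the baseline/threshold values are all hypothetical.

def summarize_tile(tile):
    """Compute a cheap local summary statistic (here, the mean pixel value)."""
    return sum(tile) / len(tile)

def select_tiles_for_upload(tiles, baseline, threshold):
    """Return IDs of tiles whose summary departs from the baseline by more
    than `threshold`; all other tiles stay at the edge node."""
    selected = []
    for tile_id, tile in tiles.items():
        if abs(summarize_tile(tile) - baseline) > threshold:
            selected.append(tile_id)
    return selected

if __name__ == "__main__":
    tiles = {
        "t1": [10, 10, 11, 9],   # near baseline: kept at the edge
        "t2": [40, 42, 39, 41],  # large departure: sent to the cloud
        "t3": [9, 10, 10, 10],   # near baseline: kept at the edge
    }
    print(select_tiles_for_upload(tiles, baseline=10.0, threshold=5.0))
```

In a real deployment the summary statistic would be replaced by a learned model that scores which packets are worth transmitting, but the structure, summarize locally and transmit selectively, is the same.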
The cloud is seen as a likely solution to the challenge of processing large volumes of spatial data while allowing visualization and analysis to be conducted from a simple workstation. Companies such as Descartes Labs are emerging as leaders in providing platform solutions, but open source and other options are also available. Very likely, we will see more spatial and GIS work migrate to cloud-based platforms in the near future. Accessibility will likely become a major issue, as researchers will increasingly want easy access to cloud-based solutions without having to pay large fees to use these powerful systems.
 For more on challenges in analyzing digital Earth ‘big data’, see: Li, Y., Yu, M., Xu, M., Yang, J., Sha, D., Liu, Q., Yang, C., 2020. Big Data and Cloud Computing, in: Guo, H., Goodchild, M.F., Annoni, A. (Eds.), Manual of Digital Earth. Springer Singapore, Singapore, pp. 325–355. https://doi.org/10.1007/978-981-32-9915-3_9.
 For more on the Descartes Labs Platform for cloud-based geospatial analytics, see: https://medium.com/descarteslabs-team/introducing-the-descartes-labs-platform-dfe308a68364.
 For more on using parallel computing and HPC, including benefits to moving to a cloud-based system for fire modeling, see: Kalabokidis, K., Athanasis, N., Gagliardi, F., Karayiannis, F., Palaiologou, P., Parastatidis, S., Vasilakos, C., 2013. Virtual Fire: A web-based GIS platform for forest fire control. Ecological Informatics 16, 62–69. https://doi.org/10.1016/j.ecoinf.2013.04.007.
 For more on using cloud-based analyses, data processing, and visualization for landscape change, see: Deng, J., Desjardins, M.R., Delmelle, E.M., 2019. An interactive platform for the analysis of landscape patterns: a cloud-based parallel approach. Annals of GIS 25, 99–111. https://doi.org/10.1080/19475683.2019.1615550.
 For more on Fog-based computing for cloud-based systems in spatial analysis, see: Barik, R.K., Priyadarshini, R., Lenka, R.K., Dubey, H., Mankodiya, K., 2020. Fog Computing Architecture for Scalable Processing of Geospatial Big Data. International Journal of Applied Geospatial Research 11, 1–20. https://doi.org/10.4018/IJAGR.2020010101.