Blog
The latest news from the Terra Bella team
Happy Birthday, SkySat-1!
Tuesday, November 24, 2015
It’s hard to believe that just two years ago we launched our first satellite, SkySat-1. Who would have known that such a new technology, built by a small team, would be able to pioneer the first commercial sub-meter imagery, high-resolution full-motion video, and nighttime video from a small satellite? To us, SkySat-1 will always symbolize the power of camaraderie and perseverance.
To commemorate SkySat-1’s birthday, we created the montage below with over 22,500 downsampled thumbnails of images collected by SkySat-1 during its second year in space. The full 600 DPI image is downloadable here.
As SkySat-1 and SkySat-2 continue to capture deep stacks of beautiful images and videos across the globe, we have spent the last year leveraging Google’s resources and smarts to lay the foundation to scale. Next year will be a big one for us as we prepare for the launch of 10+ next-generation satellites on three different rockets from three different continents.
Onwards and upwards!
To download the full resolution version of the graphic, click here.
Close-up view of the SkySat image tiles.
SkySat-1 image of Jubail, Saudi Arabia on SkySat-1’s birthday (November 21, 2015)
MapReduce for C: Run Native Code in Hadoop
Wednesday, February 18, 2015
We are pleased to announce the release of MapReduce for C (MR4C), an open source framework that allows you to run native code in Hadoop.
MR4C was originally developed at Skybox Imaging to facilitate large scale satellite image processing and geospatial data science. We found the job tracking and cluster management capabilities of Hadoop well-suited for scalable data handling, but also wanted to leverage the powerful ecosystem of proven image processing libraries developed in C and C++. While many software companies that deal with large datasets have built proprietary systems to execute native code in MapReduce frameworks, MR4C represents a flexible solution in this space for use and development by the open source community.
MR4C is built around a few simple concepts that make it straightforward to move your native code to Hadoop. Algorithms are stored in native shared objects that access data from the local filesystem or any uniform resource identifier (URI), while input/output datasets, runtime parameters, and any external libraries are configured using JavaScript Object Notation (JSON) files. Mapper splitting and resource allocation can be configured with Hadoop YARN-based tools, or at the cluster level for MRv1. Workflows of multiple algorithms can be strung together using an automatically generated configuration. Callbacks are in place for logging and progress reporting, which you can view using the Hadoop JobTracker interface, and your workflow can be built and tested on a local machine using exactly the same interface employed on the target cluster.
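To make these concepts concrete, here is a minimal, hypothetical C++ sketch of what a native algorithm plugged into a framework like this could look like. The names used here (NativeAlgorithm, Context, ThresholdTiles, reportProgress) are illustrative assumptions, not MR4C’s actual API; the real class names, method signatures, and JSON configuration format are documented on the GitHub page.

// Illustrative sketch only: the class and method names below (NativeAlgorithm,
// Context, reportProgress, ThresholdTiles) are hypothetical stand-ins, not the
// real MR4C API. The idea: the algorithm lives in a native shared object, reads
// the inputs and runtime parameters handed to it by the framework (populated
// from the JSON configuration), and reports progress back through a callback.

#include <iostream>
#include <string>
#include <vector>

// Hypothetical context the framework would pass to the algorithm: input/output
// locations and parameters that, in MR4C, would come from the JSON config files.
struct Context {
    std::vector<std::string> inputFiles;  // local paths or URIs resolved by the framework
    std::string outputDir;                // where results should be written
    double threshold = 0.5;               // example runtime parameter
    void reportProgress(float fraction) { // would surface in the Hadoop JobTracker UI
        std::cout << "progress: " << fraction * 100 << "%\n";
    }
};

// Hypothetical base class an algorithm implements; compiled into a shared object
// that each mapper loads.
class NativeAlgorithm {
public:
    virtual ~NativeAlgorithm() = default;
    virtual void execute(Context& ctx) = 0;
};

// Example image-processing step: walk over the input tiles assigned to this
// mapper and write processed results to the output directory.
class ThresholdTiles : public NativeAlgorithm {
public:
    void execute(Context& ctx) override {
        for (std::size_t i = 0; i < ctx.inputFiles.size(); ++i) {
            // ... open ctx.inputFiles[i] with an existing C/C++ imaging library,
            // apply ctx.threshold, and write the result under ctx.outputDir ...
            ctx.reportProgress(static_cast<float>(i + 1) / ctx.inputFiles.size());
        }
    }
};

// Local test harness: the same algorithm can be exercised on a laptop before
// the shared object is shipped to the cluster.
int main() {
    Context ctx;
    ctx.inputFiles = {"tile_0001.tif", "tile_0002.tif"}; // placeholder inputs
    ctx.outputDir = "/tmp/out";
    ThresholdTiles algo;
    algo.execute(ctx);
    return 0;
}

In an actual deployment, the compiled shared object, its JSON configuration, and any third-party imaging libraries are what get distributed to the cluster; the sketch above is only meant to show the shape of the pieces the paragraph describes.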
If this sounds interesting to you, get started with our documentation and source code at the MR4C GitHub page. The goal of this project is to abstract the important details of the MapReduce framework and allow users to focus on developing valuable algorithms. Let us know how we're doing in our Google Group.
Posted by Ty Kennedy-Bowdoin, Platform Processing Product Manager