  {"id":1603,"date":"2025-02-06T08:49:05","date_gmt":"2025-02-06T13:49:05","guid":{"rendered":"https:\/\/carleton.ca\/spacecraft\/?page_id=1603"},"modified":"2026-03-26T15:12:22","modified_gmt":"2026-03-26T19:12:22","slug":"datasets","status":"publish","type":"page","link":"https:\/\/carleton.ca\/spacecraft\/datasets\/","title":{"rendered":"Computer Vision Datasets"},"content":{"rendered":"\n<section class=\"w-screen px-6 cu-section cu-section--white ml-offset-center md:px-8 lg:px-14\">\n    <div class=\"space-y-6 cu-max-w-child-5xl  md:space-y-10 cu-prose-first-last\">\n\n            <div class=\"cu-textmedia flex flex-col lg:flex-row mx-auto gap-6 md:gap-10 my-6 md:my-12 first:mt-0 max-w-5xl\">\n        <div class=\"justify-start cu-textmedia-content cu-prose-first-last\" style=\"flex: 0 0 100%;\">\n            <header class=\"font-light prose-xl cu-pageheader md:prose-2xl cu-component-updated cu-prose-first-last\">\n                                    <h1 class=\"cu-prose-first-last font-semibold !mt-2 mb-4 md:mb-6 relative after:absolute after:h-px after:bottom-0 after:bg-cu-red after:left-px text-3xl md:text-4xl lg:text-5xl lg:leading-[3.5rem] pb-5 after:w-10 text-cu-black-700 not-prose\">\n                        Computer Vision Datasets\n                    <\/h1>\n                \n                                \n                            <\/header>\n\n                    <\/div>\n\n            <\/div>\n\n    <\/div>\n<\/section>\n\n\n\n<p>One of the key challenges related to any dual-spacecraft missions involving rendezvous and proximity operations maneuvers is the onboard determination of the pose (i.e., relative position and orientation) of a target object with respect to the robotic chaser spacecraft equipped with computer vision sensors. The relative pose of a target represents crucial information upon which real-time guidance, trajectory control, and docking\/capture maneuvers are planned and executed. 
Besides relative pose determination, other computer vision tasks such as detection, 3D model reconstruction, component identification, and foreground\/background segmentation are often required.<\/p>\n\n\n\n<p>Before deploying any computer vision algorithms in operational scenarios, it is important to ensure they will perform as expected, regardless of the harsh lighting conditions encountered in low-Earth orbit and even when applied to unknown uncooperative targets. To address the need to train and\/or test computer vision algorithms across a wide range of scenarios, conditions, target spacecraft, and visual features, the Spacecraft Robotics Laboratory has developed labeled datasets of images of various target spacecraft, ranging from actual on-orbit imagery to synthetic rendered models.<\/p>\n\n\n\n<h2 id=\"event-based-observation-of-spacecraft-evos-dataset\" class=\"wp-block-heading\">EVent-based Observation of Spacecraft (EVOS) Dataset<\/h2>\n\n\n\n<p>The EVOS (EVent-based Observation of Spacecraft) dataset is designed to support research in event-based vision for autonomous on-orbit inspection and space debris removal. The dataset was collected at Carleton University&#8217;s Spacecraft Proximity Operations Testbed using an IniVation DVXplorer Micro event camera mounted on a stationary chaser spacecraft platform that observes a moving target spacecraft. The observed target is covered in multi-layer insulation and is equipped with a solar panel and a docking cone. The dataset contains 15 unique experiments, each approximately 300 seconds long, featuring distinct trajectories under different lighting conditions. Time-synchronized ground-truth position and velocity data are provided via a PhaseSpace motion capture system with sub-millimeter accuracy.<\/p>\n\n\n\n<p>Crain, A., and Ulrich, S., &#8220;EVent-based Observation of Spacecraft (EVOS) Dataset,&#8221; Federated Research Data Repository, 2025. 
<a href=\"https:\/\/doi.org\/10.20383\/103.01538\">https:\/\/doi.org\/10.20383\/103.01538<\/a>. <\/p>\n\n\n\n<h3 id=\"related-publications\" class=\"wp-block-heading\">Related Publications<\/h3>\n\n\n\n<p>Crain, A., and Ulrich, S., &#8220;Event-Based Spacecraft Representation Using Inter-Event-Interval Adaptive Time Surfaces,&#8221; <i>36th AIAA\/AAS Space Flight Mechanics Meeting, Orlando, <\/i>FL, 12-16 Jan, 2026.<\/p>\n\n\n\n<h2 id=\"spacecraft-thermal-infrared-stir-dataset\" class=\"wp-block-heading\">Spacecraft Thermal Infrared (STIR) Dataset<\/h2>\n\n\n\n<figure class=\"wp-block-image alignleft\"><img loading=\"lazy\" decoding=\"async\" width=\"240\" height=\"180\" src=\"https:\/\/carleton.ca\/spacecraft\/wp-content\/uploads\/sites\/229\/data_01014-240x180.png\" alt=\"\" class=\"wp-image-1700\" srcset=\"https:\/\/carleton.ca\/spacecraft\/wp-content\/uploads\/sites\/229\/data_01014-240x180.png 240w, https:\/\/carleton.ca\/spacecraft\/wp-content\/uploads\/sites\/229\/data_01014-160x120.png 160w, https:\/\/carleton.ca\/spacecraft\/wp-content\/uploads\/sites\/229\/data_01014.png 320w\" sizes=\"auto, (max-width: 240px) 100vw, 240px\" \/><\/figure>\n\n\n\n<p>The Spacecraft Thermal Infrared (STIR) Dataset is used primarily to train machine learning-based pose determination methods applied to spacecraft proximity operations. STIR consists of 16,641 thermal infrared pictures of a free-floating spacecraft target platform, captured at ÐÓ°ÉÔ­´´ University&#8217;s Spacecraft Proximity Operations Testbed by an ICI-9320 thermal camera installed on the chaser spacecraft platform. For each of the frames, the relative planar three-degree-of-freedom pose of the target with respect to the chaser is included (x and y positions, and yaw angle). 
This ground-truth data was acquired by a PhaseSpace motion capture system with sub-millimeter accuracy.<\/p>\n\n\n\n<p>Budhkar, A., and Ulrich, S., &#8220;Spacecraft Thermal Infrared (STIR) Dataset,&#8221; Federated Research Data Repository, 2025. <a href=\"https:\/\/doi.org\/10.20383\/103.01333\">https:\/\/doi.org\/10.20383\/103.01333<\/a>.<\/p>\n\n\n\n<h3 id=\"related-publications\" class=\"wp-block-heading\">Related Publications<\/h3>\n\n\n\n<p>Budhkar, A., and Ulrich, S., &#8220;Neural Network-Based Spacecraft Pose Determination Using Thermal Imagery,&#8221; <i>AAS\/AIAA Space Flight Mechanics Meeting,<\/i> Kaua&#8217;i, HI, 19-23 Jan, 2025.<\/p>\n\n\n\n<h2 id=\"satellite-segmentation-satseg-dataset\" class=\"wp-block-heading\">Satellite Segmentation (SATSEG) Dataset<\/h2>\n\n\n\n<figure class=\"wp-block-image alignleft\"><img loading=\"lazy\" decoding=\"async\" width=\"240\" height=\"324\" src=\"https:\/\/carleton.ca\/spacecraft\/wp-content\/uploads\/sites\/229\/SATSEG-240x324.png\" alt=\"\" class=\"wp-image-1607\" srcset=\"https:\/\/carleton.ca\/spacecraft\/wp-content\/uploads\/sites\/229\/SATSEG-240x324.png 240w, https:\/\/carleton.ca\/spacecraft\/wp-content\/uploads\/sites\/229\/SATSEG-160x216.png 160w, https:\/\/carleton.ca\/spacecraft\/wp-content\/uploads\/sites\/229\/SATSEG-768x1037.png 768w, https:\/\/carleton.ca\/spacecraft\/wp-content\/uploads\/sites\/229\/SATSEG-400x540.png 400w, https:\/\/carleton.ca\/spacecraft\/wp-content\/uploads\/sites\/229\/SATSEG-1137x1536.png 1137w, https:\/\/carleton.ca\/spacecraft\/wp-content\/uploads\/sites\/229\/SATSEG-1517x2048.png 1517w, https:\/\/carleton.ca\/spacecraft\/wp-content\/uploads\/sites\/229\/SATSEG-360x486.png 360w\" sizes=\"auto, (max-width: 240px) 100vw, 240px\" \/><\/figure>\n\n\n\n<p>The SATellite SEGmentation (SATSEG) dataset 
is used to benchmark segmentation methods for space-based applications. SATSEG consists of 100 color and grayscale images of actual spacecraft and of a laboratory mockup, captured by visual and thermal cameras. Spacecraft in this dataset include Cygnus, Dragon, the ISS, the Space Shuttle, CubeSats, Hubble, Orbital Express, and Radarsat. Also included in SATSEG is manually produced ground-truth data that provides a binary mask for each image (foreground = 255 \/ background = 0).<\/p>\n\n\n\n<p>Shi, J.-F., and Ulrich, S., &#8220;Satellite Segmentation (SATSEG) Dataset,&#8221; Borealis, 2018. <a href=\"https:\/\/doi.org\/10.5683\/SP3\/VDAN02\" target=\"_blank\" rel=\"noopener\">https:\/\/doi.org\/10.5683\/SP3\/VDAN02<\/a>.<\/p>\n\n\n\n<h3 id=\"related-publications\" class=\"wp-block-heading\">Related Publications<\/h3>\n\n\n\n<p>Shi, J.-F., and Ulrich, S., \u201cUncooperative Spacecraft Pose Estimation using Monocular Monochromatic Images,\u201d <em>AIAA Journal of Spacecraft and Rockets,<\/em> Vol. 58, No. 2, 2021, pp. 284\u2013301.<\/p>\n\n\n\n<p>Shi, J.-F., Ulrich, S., and Ruel, S., &#8220;Regional Method for Monocular Infrared Image Spacecraft Pose Estimation,&#8221; <i>AIAA SPACE and Astronautics Forum and Exposition,<\/i> AIAA Paper 2018-5281, Orlando, FL, 2018.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>One of the key challenges in any dual-spacecraft mission involving rendezvous and proximity operations maneuvers is the onboard determination of the pose (i.e., relative position and orientation) of a target object with respect to the robotic chaser spacecraft equipped with computer vision sensors. 
The relative pose of a target represents crucial information upon which [&hellip;]<\/p>\n","protected":false},"author":2,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"_acf_changed":false,"_cu_dining_location_slug":"","footnotes":"","_links_to":"","_links_to_target":""},"cu_page_type":[],"class_list":["post-1603","page","type-page","status-publish","hentry"],"acf":{"cu_post_thumbnail":""},"_links":{"self":[{"href":"https:\/\/carleton.ca\/spacecraft\/wp-json\/wp\/v2\/pages\/1603","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/carleton.ca\/spacecraft\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/carleton.ca\/spacecraft\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/carleton.ca\/spacecraft\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/carleton.ca\/spacecraft\/wp-json\/wp\/v2\/comments?post=1603"}],"version-history":[{"count":4,"href":"https:\/\/carleton.ca\/spacecraft\/wp-json\/wp\/v2\/pages\/1603\/revisions"}],"predecessor-version":[{"id":1815,"href":"https:\/\/carleton.ca\/spacecraft\/wp-json\/wp\/v2\/pages\/1603\/revisions\/1815"}],"wp:attachment":[{"href":"https:\/\/carleton.ca\/spacecraft\/wp-json\/wp\/v2\/media?parent=1603"}],"wp:term":[{"taxonomy":"cu_page_type","embeddable":true,"href":"https:\/\/carleton.ca\/spacecraft\/wp-json\/wp\/v2\/cu_page_type?post=1603"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}