

Abstracts and Talk Materials

We show that a Vietoris-Rips complex or a restricted Cech complex on a finite set of points from the circle is homotopy equivalent to either a point, an odd-dimensional sphere, or a wedge sum of spheres of the same even dimension. The restricted Cech complex is defined as the nerve of balls in the circle, i.e. the nerve of circular arcs. The homotopy type of a Vietoris-Rips complex on evenly spaced circular points was determined by Michal Adamaszek in "Clique complexes and graph powers." Joint work with Michal Adamaszek, Christopher Peterson, and Corrine Previte.
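As a concrete illustration (ours, not from the paper): the 1-skeleton of a Vietoris-Rips complex on evenly spaced circle points is a circulant graph, and it is the clique complex of such graphs that Adamaszek's result describes. A minimal sketch:

```python
import math

def circle_points(n):
    """n evenly spaced points on the unit circle."""
    return [(math.cos(2 * math.pi * k / n), math.sin(2 * math.pi * k / n))
            for k in range(n)]

def vr_edges(points, r):
    """1-skeleton of the Vietoris-Rips complex at scale r: connect
    every pair of points at Euclidean distance at most r."""
    n = len(points)
    return {(i, j) for i in range(n) for j in range(i + 1, n)
            if math.dist(points[i], points[j]) <= r}

# For evenly spaced points, vertex i is joined to i +- 1, ..., i +- k
# (mod n): a circulant graph.  The chord spanning s steps has length
# 2*sin(pi*s/n), so at n = 6 and r = 1.1 only single steps qualify.
edges = vr_edges(circle_points(6), 1.1)
print(sorted(edges))  # the 6-cycle: [(0, 1), (0, 5), (1, 2), (2, 3), (3, 4), (4, 5)]
```

Varying r changes the step count k, and the homotopy type of the clique complex then depends only on the ratio k/n.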

Suppose ball-shaped sensors wander in a bounded domain. A sensor doesn't know its location but does know when it overlaps a nearby sensor. We say that an evasion path exists in this sensor network if a moving intruder can avoid detection. Vin de Silva and Robert Ghrist give a necessary condition, depending only on the time-varying connectivity data of the sensors, for an evasion path to exist. Using zigzag persistent homology, we provide an equivalent condition that moreover can be computed in a streaming fashion. However, no method with time-varying connectivity data (i.e. Cech complexes) as input can give necessary and sufficient conditions for the existence of an evasion path. Indeed, we show that the existence of an evasion path depends on more than just the fibrewise homotopy type of the region covered by sensors. In the setting of planar sensors that also measure weak rotation information, we provide necessary and sufficient conditions for the existence of an evasion path, and we pose an open question concerning Cech and alpha complexes. Joint with Gunnar Carlsson.

The second part of my talk is joint work with Michal Adamaszek, Christopher Peterson, and Corrine Previte. We show that a Vietoris-Rips complex or a restricted Cech complex on a finite set of points from the circle is homotopy equivalent to either a point, an odd-dimensional sphere, or a wedge sum of spheres of the same even dimension.



http://www.ece.ncsu.edu/people/agentim

How do mathematicians interact? How do they collaborate to create papers? In this project we created a model that describes those interactions in terms of collaborations. We then used Persistent Homology to compute obstructions, similar to information flow obstructions in computer networks. The main ideas presented in this poster are:

- The creation of a metric amongst mathematicians.
- The Socioplex.
- Persistent Homology and obstructions.


The task of caging, that is constraining the movement of a body in 2- or 3-dimensional space with a few obstacles, is important from the viewpoint of robotics, and translates into a neat elementary topological setup. I will survey a few old and new results in this direction.

In this talk I will introduce some techniques for topological reasoning within the purview of graph search-based motion planning.
Classically, in the robotics and artificial intelligence literature, a popular approach to dealing with complexities in configuration spaces (high dimension or topological non-trivialities) is to create a graph by placing vertices at discrete locations in the configuration space and establishing edges between "neighbouring" vertices. This approach is simple, suits the design of efficient search algorithms like A* and Dijkstra's, and allows incremental construction without global information about the entire configuration space to begin with. Such algorithms focus on and use the graph alone for solving problems like path planning, coverage and exploration, while discarding the richer topological information of the original configuration space. The lost information, for example, makes it impossible to distinguish between paths in the graph that belong to different homotopy or homology classes in the original configuration space.
Algebraic topology gives us the tools to "generalize" the notion of a graph to higher dimensions so as to represent the richer topological features of the space as a chain complex (e.g., a simplicial complex). However, there is no natural extension of graph search algorithms like Dijkstra's or A* to such complexes that is efficient and can be executed without knowing the entire complex from the very beginning.
In this talk I will present tools and techniques that can be used as efficient add-ons to graph search algorithms like Dijkstra's and A*, letting us keep track of homology classes while searching for optimal paths in the graph. In particular, I will introduce a way of computing closed, non-exact differential 1-forms whose integration along singular cycles gives complete invariants of their homology classes in the free space, and explain how this integration can be used to modify graph search algorithms for planning optimal trajectories subject to homology-class constraints. I will illustrate this for N-dimensional Euclidean spaces punctured by obstacles. Although we will be using the de Rham cochain complex and its pairing (integration) with the singular chain complex with coefficients in the reals, I will outline how this technique can be modified to consider homology with coefficients in Z_2, and illustrate the benefit of doing so. I will also introduce similar techniques for reasoning about homotopy classes of trajectories while planning optimal trajectories on a plane punctured by obstacles.
After introducing the fundamental theoretical and algorithmic tools, I will demonstrate how these techniques can be applied to efficiently solve some real problems in robotics.
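A toy version of the homology-class bookkeeping described above, for a single point obstacle in the plane, integrates the closed 1-form d(theta) along a discretized path (the function name h_signature is our label for this sketch, not terminology from the talk):

```python
import math

def h_signature(path, obstacle):
    """Discrete winding-angle integral of a polygonal path around a
    point obstacle: the integral of the closed 1-form d(theta) along
    the path.  Closed paths around the obstacle pick up multiples of
    2*pi, so the value separates homology classes in the punctured
    plane."""
    ox, oy = obstacle
    total = 0.0
    for (x0, y0), (x1, y1) in zip(path, path[1:]):
        da = (math.atan2(y1 - oy, x1 - ox) -
              math.atan2(y0 - oy, x0 - ox))
        # unwrap to (-pi, pi] so each segment contributes its true turn
        while da <= -math.pi:
            da += 2 * math.pi
        while da > math.pi:
            da -= 2 * math.pi
        total += da
    return total

# A square loop around the origin winds once; around a far-away
# obstacle it winds zero times.
loop = [(1, 1), (-1, 1), (-1, -1), (1, -1), (1, 1)]
print(round(h_signature(loop, (0, 0)) / (2 * math.pi)))  # -> 1
print(round(h_signature(loop, (5, 5)) / (2 * math.pi)))  # -> 0
```

Attaching one such signature per obstacle to each partial path in Dijkstra's or A* distinguishes otherwise identical graph vertices reached through different homology classes.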

We study the minimum number of cuts to the configuration space of n points in the plane such that a continuous control can be defined. The control defines the movement of a set of points, and introducing the cuts to the configuration space simplifies the continuous control. This simplification reduces cost during movement and prevents collisions between points. For certain sets of stabilization points, we have found that the minimum number of cuts is small relative to the whole space.

http://www.math.ntnu.no/~botnan/

The Cech complex is one of the most widely used tools in applied algebraic topology.
Unfortunately, due to the inclusive nature of the Cech filtration, the
number of simplices grows exponentially in the number of input points. A practical
consequence is that computations may have to terminate at smaller scales than what
the application calls for. We propose two methods to approximate the Cech
persistence module. Both constructions are built on the level of spaces, i.e. as sequences of simplicial
complexes induced by nerves. We also show how the bottleneck distance between
such persistence modules can be understood by how tightly they are sandwiched on
the level of spaces. In turn, this implies the correctness of our approximation methods.
Finally, we implement our methods and apply them to some example point clouds
in Euclidean space.
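One standard way to make such a sandwich statement precise (a sketch of the usual argument, not necessarily the authors' exact formulation): if the approximation is nested between Cech complexes at two scales,

```latex
\mathcal{C}(r) \;\subseteq\; \mathcal{A}(r) \;\subseteq\; \mathcal{C}(c\,r)
\quad \text{for all } r > 0
\;\Longrightarrow\;
d_B\!\bigl(\operatorname{dgm}\log\mathcal{A},\,\operatorname{dgm}\log\mathcal{C}\bigr) \;\le\; \log c,
```

i.e. the induced persistence modules are multiplicatively interleaved, which bounds the bottleneck distance on the log scale by the tightness of the sandwich.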

We consider a dynamic coverage problem in a sensor network where the sensors follow a sleep-wake protocol in order to save energy. By applying a periodic protocol, we show that the waves of wake-state sensors automatically solve pursuit/evasion-type problems. We also observe that the propagating waves are represented by cohomology classes of the underlying space, and prove that every cohomology class is realizable. As a corollary, we extend our theory to cyclic network automata, which eventually exhibit periodic dynamics.

I will present a distributed algorithm to localize coverage holes in sensor networks. A Rips complex built from the communication graph is used to represent the coverage information. I will show that harmonics, the elements of the kernel of the first Laplacian, can be effectively utilized to infer topology distributively. This, followed by a ``divide-and-conquer'' strategy, will help us localize the coverage holes efficiently. When the proximity information is noisy, which is usually the case in real networks, the Rips complex built from the communication graph is unreliable. This problem is reasonably mitigated using topological persistence. I will show that the underlying framework also allows us to check for persistence distributively.
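To make "harmonics" concrete, here is a minimal sketch (ours, not the speaker's implementation) of the first combinatorial Laplacian for a hollow triangle; the dimension of its kernel equals the first Betti number, i.e. the number of independent coverage holes:

```python
import numpy as np

# Hollow triangle: vertices {0, 1, 2}, oriented edges {01, 02, 12},
# and no 2-simplices (the triangle is not filled in).
d1 = np.array([[-1, -1,  0],   # vertex 0
               [ 1,  0, -1],   # vertex 1
               [ 0,  1,  1]])  # vertex 2
d2 = np.zeros((3, 0))          # empty boundary map from 2-chains

# First combinatorial (Hodge) Laplacian on edges.
L1 = d1.T @ d1 + d2 @ d2.T

# Harmonic 1-forms are the kernel of L1; its dimension is the first
# Betti number, counting the holes.
betti1 = L1.shape[0] - np.linalg.matrix_rank(L1)
print(betti1)  # -> 1: the single hole of the hollow triangle
```

In the distributed setting the same kernel is approached iteratively, each edge updating from the Laplacian entries of its neighbors rather than from a global matrix.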

Topological maps have been used in the mobile robot community, starting with Ben Kuipers' insightful work on distinctive places and the edges that connect them. This map was a variation of the Voronoi diagram, which I use in my work on robot exploration. In this talk, I will put forward a definition of a topological map and show why the Voronoi diagram is a good topological map. Sadly, this is the only example of a good topological map that I can find, and I will talk about some attempts to make good topological maps in higher-dimensional spaces. Finally, I will discuss some problems that exploit a topological map to achieve tasks including SLAM, optimal planning, and hybrid control.

Consider a region covered by "sensors", each reporting the number of agents within its range. The questions of the total number of agents, the minimum number of agents, and the most likely number of agents consistent with these reports will be discussed in terms of topology, as well as of "hard" problems within linear programming. This talk is based on joint work with Bill Moran, Wang Zengfu, and Doug Cochran.
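One topological approach to such counting problems is Euler-calculus integration of the count function, in the style of Baryshnikov and Ghrist (a one-dimensional sketch of ours, not necessarily the method of the talk):

```python
def components(bits):
    """Number of maximal runs of True: the Euler characteristic of an
    upper excursion set on a line segment."""
    runs, prev = 0, False
    for b in bits:
        if b and not prev:
            runs += 1
        prev = b
    return runs

def euler_integral(counts):
    """Integrate a piecewise-constant count function h against the
    Euler characteristic: sum over levels s >= 1 of chi({h >= s}).
    When each agent's support is contractible, this recovers the
    number of agents exactly."""
    total, s = 0, 1
    while any(c >= s for c in counts):
        total += components([c >= s for c in counts])
        s += 1
    return total

# Two agents whose supports overlap in one cell: the raw sum of the
# reports over-counts, but the Euler integral recovers the count.
print(euler_integral([1, 1, 2, 1, 1]))  # -> 2
```

The linear-programming side of the talk enters when the supports are not contractible and one must optimize over count functions consistent with the reports.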

In this work, we consider a minimalistic approach to vehicle localization that constrains the vehicle’s ability to sense its environment to a binary detection of uniquely identifiable landmarks having unknown position (e.g., a Wi-Fi transceiver detecting Access Point MAC addresses in an urban setting). Central to the proposed solution are dual simplicial nerve complexes that approximate the topology of the environmental free-space. The notion of a “hole” within these complexes naturally represents a physical structure (e.g., a building) that limits landmark visibility/communication with respect to the vehicle’s location. Taking advantage of this property, we formulate a “homological sensing model” that operates on these constructs, enabling the vehicle to “count” the number of structures within its vicinity using local homology computations as a pseudo-metric surrogate sensor. The information from such a homological sensor is used in the update step of a Monte-Carlo localization algorithm. Vehicle location is estimated by correlating the measured number of locally sensed holes with an unlabeled metric map (e.g. Google map). Initial experimental results using WiFi MAC address visibility data, collected over a 13-mile tour of a dense urban environment, will be discussed.

Persistence modules are the algebraic objects at the heart of various
theories of persistence. I will discuss some recent work with Bubenik and
Scott on generalised persistence modules (viewed categorically), as well
as some other studies with Chazal, Oudot and Glisse on the structure of
1-parameter persistence modules (the classical case). The idea is to move
beyond the usual restriction to modules with "finitely many critical
points" to allow much broader families of examples.

We exploit minimal sensing information gathered from biologically inspired sensor networks to perform exploration and mapping in an unknown environment. A probabilistic motion model of mobile sensing nodes, inspired by motion characteristics of cockroaches, is utilized to extract weak encounter information in order to build a topological representation of the environment.
Neighbor-to-neighbor interactions among the nodes are exploited to build point clouds representing spatial features of the manifold characterizing the environment based on the sampled data.
To extract dominant features from sampled data, topological data analysis is used to produce persistence intervals for features, to be used for topological mapping. In order to improve robustness characteristics of the sampled data with respect to outliers, density based subsampling algorithms are employed. Moreover, a robust scale-invariant classification algorithm for persistence diagrams is proposed to provide a quantitative representation of desired features in the data.
Our approach is analyzed using simulation and experimental data using a swarm robotic platform called the WolfBot.

Distributed sensing, information, and computation based on sensor networks comprise a new frontier in engineering and science. Cheap, easily available sensors collectively performing complicated tasks are forthcoming, and in security and other applications involving sensor networks, the coverage of a region of interest is of high importance.
In this talk we will quickly summarize progress made over the last eight years in topological sensing and detection, and propose a revised definition of topological coverage which makes all the topological criteria more intuitive. Then we will propose a distributed method to compute (persistent) homology with theoretical guarantees over a sensor network. In conclusion we will consider the problem of sensor repair in a working sensor network that used to cover the region of interest.

The length of a curve in Euclidean space is an elementary geometric concept,
and it is well defined provided the curve is not wild. We consider the problem
of computing the length of curves or tube-like shapes, such as root systems of
plants. Branching is allowed, but the real difficulty lies in the small but
positive thickness, which renders length an undefined concept, at least in the
mathematical sense. Noticing the abundance of tube-like shapes in nature and
therefore in the sciences, we aim at producing a length estimate that is stable
under perturbations of the shape.

This is joint work with Florian Pausinger.


The Steiner polynomial of a solid body in R^d is of degree d
and describes the volume as a function of the thickening parameter
(parallel body). There are d+1 coefficients which are used to define
the d+1 intrinsic volumes of the solid body. In d=2 dimensions,
the first intrinsic volume is the length, and in d=3 dimensions,
it is the total mean curvature of the boundary. Using an integral
geometric approach, we modify the formula using persistent moments
to get a measure for approximating bodies that converges to the first
intrinsic volume of the solid body.
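For reference, the Steiner formula behind this discussion (a standard statement, with $\kappa_m$ the volume of the unit $m$-ball and $V_j$ the intrinsic volumes):

```latex
\mathrm{vol}_d\bigl(K \oplus r\,B^d\bigr)
  \;=\; \sum_{j=0}^{d} V_j(K)\,\kappa_{d-j}\,r^{\,d-j},
```

so for a convex body in the plane the expansion reads $\mathrm{area}(K) + \mathrm{perimeter}(K)\,r + \pi r^2$, while for a tube-like (essentially one-dimensional) shape the first intrinsic volume $V_1$ is the length one wishes to estimate.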

We show that the first and top homology groups of the chain complex of Sullivan diagrams of the topological type of the punctured disk are trivial. We compute all the homology groups of the chain complex of Sullivan diagrams of the topological type of the disk with up to seven punctures, and we give generators for the non-trivial groups. We use these generators to give two infinite families of non-trivial classes of the homology of Sullivan diagrams of the topological type of the generalized pair of pants, and show that these give non-trivial string operations.

HAP is a software package for basic computations in homotopy theory, especially those related to the fundamental group. It is distributed as part of the GAP system for computational algebra. This poster displays a few calculations that can currently be made with the software.

I will discuss several probabilistic models producing simplicial complexes, manifolds and discrete
groups. Random simplicial complexes are high dimensional analogues of random graphs and can be
used for studying the behaviour of large systems or networks depending on many random
parameters. We are interested in properties of random spaces which are satisfied with probability
tending to one. Using probabilistic models one may also probabilistically test the validity of open
topological questions such as the Whitehead and the Eilenberg-Ganea conjectures.

A dynamic sensor network may be represented as a sequence of simplicial complexes, and analyzed using zigzag persistent homology, with the resulting barcode containing birth-death intervals for homological features of the sequence. In this talk I will first describe the relationship between these intervals and the time-varying coverage holes in the network, as well as the scenarios in which the zigzag persistence barcode works well as a coverage descriptor. Further, I will propose a method for making a geometrically-relevant choice of representative cycle for each feature at its birth and/or death time, and propagating that information over the entire interval. This can allow “tracking” of the coverage holes in the network. Additionally, these representative cycles are used in conjunction with a hop-distance filtration to determine size estimates for the holes over time. This estimated size information is then used to enhance the regular barcode, giving a visual and quantitative descriptor of the time-varying network coverage.

In this talk, I present a new method to detect robust common topological structures of two geometric objects. The idea is to extend the notion of persistent homology to representations on a commutative triple ladder quiver. (i) I show that representations on the commutative triple ladder quiver are of finite type. (ii) The Auslander-Reiten quiver of the commutative triple ladder, which lists all the possible isomorphism classes of indecomposable persistence modules and the irreducible morphisms among them, is explicitly derived. In addition, the notion of persistence diagrams is generalized to graphs on the Auslander-Reiten quiver. (iii) An algorithm for computing indecomposable decompositions by using the Auslander-Reiten quiver is presented. (iv) A numerical example to detect robust common topological features is shown.

Ayasdi technology is based on the ideas of Topological Data Analysis (TDA), and produces meaningful topological graphs across a multitude of data types (including mixed data types). Given a notion of similarity, Ayasdi will produce topological graphs which show the shape of data, generating insights into the underlying phenomena of the data set. TDA provides a method for studying numerous properties of point cloud data sets.
1. TDA permits novel ways of visualizing and compressing data sets, which facilitate understanding and monitoring of data.
2. TDA quickly identifies interrelationships among disparate data sets.
3. TDA can be conducted at various levels of resolution and scale. This supports rapid and interactive analysis of data.

With these basic principles, the shapes within a topological data map generate insights into the data. This work is an implementation of the TDA theory. The data used in this work was simulated to represent moving objects, and we built neighborhood graphs that characterize the local relationships in the data. We will discuss additional graphs that can represent the underlying neighborhood graph in a more readily digested format.
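A construction of this flavor is the mapper algorithm that underlies such topological graphs. The following is a minimal one-dimensional sketch of ours (function and parameter names are our own), not Ayasdi's implementation:

```python
import itertools
import math

def mapper_graph(points, filt, n_bins=4, overlap=0.25, eps=1.5):
    """Minimal 1-D mapper sketch: cover the range of the filter with
    overlapping intervals, single-linkage cluster each preimage at
    scale eps, and connect clusters that share points."""
    lo, hi = min(filt), max(filt)
    width = (hi - lo) / n_bins
    clusters = []  # each cluster is a frozenset of point indices
    for b in range(n_bins):
        a = lo + b * width - overlap * width
        c = lo + (b + 1) * width + overlap * width
        remaining = {i for i, f in enumerate(filt) if a <= f <= c}
        while remaining:  # grow one single-linkage component at a time
            comp = {remaining.pop()}
            grew = True
            while grew:
                grew = False
                for j in list(remaining):
                    if any(math.dist(points[j], points[i]) <= eps for i in comp):
                        comp.add(j)
                        remaining.discard(j)
                        grew = True
            clusters.append(frozenset(comp))
    edges = {(p, q) for p, q in itertools.combinations(range(len(clusters)), 2)
             if clusters[p] & clusters[q]}
    return clusters, edges

# A line of 10 points with the x-coordinate as filter gives a path graph.
pts = [(float(i), 0.0) for i in range(10)]
clusters, edges = mapper_graph(pts, [p[0] for p in pts])
print(len(clusters), sorted(edges))  # 4 clusters joined in a path
```

The shape of the output graph (a path, a cycle, a flare) is what generates the insights described above; production systems differ mainly in the choice of filter, cover, and clustering method.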



An important problem with sensor networks is that they do not provide information about the regions that are not covered by their sensors. If the sensors in a network are static, then the Alexander Duality Theorem from classical algebraic topology is sufficient to determine the coverage of a network. However, in many networks the nodes change position with time. In the case of dynamic sensor networks, we consider the covered and uncovered regions as parametrized spaces with respect to time. Parametrized homology is a variant of zigzag persistent homology that measures how the homology of the levelsets of the space changes as we vary the parameter. We will present a few theorems that extend different versions of the classical Alexander Duality theorem to the setting of parametrized homology theories. This approach sheds light on the practical problem of ‘wandering’ loss of coverage within dynamic sensor networks.

Recently, spectral graph theory has shed light on some elusive aspects of electromagnetic modeling of (magnetic) electric guitar pickups: specifically, the ability of an expert to infer, by listening, the manner in which the pickup was wound. The pickup is often modeled as an RLC circuit. However, this simple model does not account for the aspect of the “tone” that is characterized by how the pickup was wound. Psycho-acoustic experiments reveal that any acoustically accurate model has to reproduce the first 30 milliseconds of the transient response with extreme precision, since it is within this timeframe that the brain makes inferences about the pertinent aspects of the pickup’s tone.

We propose a model which is not amenable to simple-minded model order reduction and is impractical for brute-force numerical simulations. However, by focusing on modeling electromagnetic details and exposing a connection to spectral graph theory, a framework for analyzing the transient response in sufficient detail can be developed. It turns out that the model is insensitive to how the pickup was wound, with the exception of the inter-winding capacitance matrix, which describes the capacitance between the (typically 5,000-8,000) turns of the winding. In the proposed model, this sparse matrix can be reconstructed from the adjacency matrix of its associated graph. In this manner the variations in the pickup’s tone are entirely dependent on the spectrum of this adjacency matrix; in particular, the variations in the “attack” of the pickup’s transient response due to how the pickup was wound relate to how the spectrum of the adjacency matrix reflects the connectivity of the graph.


We will discuss some topology-based techniques in the area of signal processing, with applications to sensor/social networks as well as medical signal processing.
We will furthermore describe some highly efficient implementations bringing them closer to practice.

Homology on semimodule-valued sheaves naturally generalizes network flows from the setting of numerical capacity constraints to other sorts of constraints (e.g. stochastic, multicommodity). In this talk, we present new work relating the algebraic structure of flows with local network properties and algebraic properties of the ground semiring. For example, a sheaf-valued flow admits an interpretation as an equalizer if the in-degree or out-degree of each vertex is no more than 1, or the stalks are invertible, or the stalks are flat; applications include calculational methods. For another example, a constant semiring-valued flow decomposes into loops under certain factorization properties of the semiring. For yet another example, semimodule-valued sheaves under which homology satisfies an exactness property exhibit rigid constraints.

We define a simple, explicit map sending a morphism $f: M \to N$ of pointwise finite dimensional persistence modules to a matching between the barcodes of $M$ and $N$. Our main result is that, in a precise sense, the quality of this matching is tightly controlled by the lengths of the longest intervals in the barcodes of $\ker f$ and $\operatorname{coker} f$.

As an immediate corollary, we obtain a new proof of the algebraic stability of persistence barcodes, a fundamental result in the theory of persistent homology (due originally to Chazal et al., building on work of Cohen-Steiner et al.). In contrast to previous proofs, ours shows explicitly how a $\delta$-interleaving morphism between two persistence modules induces a $\delta$-matching between the barcodes of the two modules. Our main result also specializes to a structure theorem for submodules and quotients of persistence modules.
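In symbols, the corollary is the algebraic stability theorem (a standard formulation, not quoted verbatim from the paper):

```latex
d_B\bigl(\mathcal{B}(M),\,\mathcal{B}(N)\bigr) \;\le\; d_I(M,N),
```

where $d_B$ is the bottleneck distance between barcodes and $d_I$ the interleaving distance between modules; the induced matching realizes a $\delta$-matching from any $\delta$-interleaving.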


In this work, we describe a prototype of an automatic segmentation and annotation system for endoscopy images. The algorithm classifies images using topological features. The pipeline consists of three steps: image processing, calculation of topological features, and final classification.

In-hand manipulation means moving or turning an object in the hand. For example, when you pick up a hammer, you have to shift it to the proper grasp before using it. Most studies of in-hand manipulation work by wiggling the fingers for small motions, or walking the fingers along the object surface for larger motions. But there have been several studies in the past of other approaches. An object can be rolled in the hand, just using dynamics and gravity. Or an object's pose can be adjusted by pressing it against an external contact. Or the hand can toss the object in the air and then catch it in a different pose. These all have one thing in common: instead of the intrinsic motion capabilities of the hand, they rely on extrinsic resources: external surfaces, dynamic motion capabilities of the arm, and gravity. This talk will present recent work on this class of actions, which we call "extrinsic dexterity". The talk will demonstrate several extrinsically dexterous operations and outline some of the challenges and planned work. Because of the challenges of developing physics models of these operations, we are using machine learning techniques to derive stochastic models from practice.

This work is a collaboration with Nikhil Chavan Dafle, Alberto Rodriguez, Robert Paolini, Bowei Tang, Siddhartha S. Srinivasa, Michael Erdmann, Ivan Lundberg, Harald Staab and Thomas Fuhlbrigge.


In order to understand the properties of a real-valued function on a topological space, we can study the Reeb graph of that function. The Reeb graph is a construction which summarizes the connectivity of the level sets. Since it is efficient to compute and is a useful descriptor for the function, it has found its place in many applications. As with many other constructions in computational topology, we are interested in how to deal with this construction in the context of noise. In particular, we would like a method to "smooth out" the topology to get rid of, for example, small loops in the Reeb graph.
In this talk, we will define a generalization of a Reeb graph as a functor. Using the added structure given by category theory, we can define interleavings on Reeb graphs which can be used to compare them. This also gives an immediate method for topological smoothing and we will discuss an algorithm for computing this smoothed Reeb graph.

This is joint work with Vin de Silva and Amit Patel.


I'll describe a framework for studying what happens when one imposes various structural conditions on clustering schemes and tries to identify all methods that comply with such conditions. Within this framework, it is possible to prove a theorem analogous to one of J. Kleinberg, in which one obtains existence and uniqueness instead of a non-existence result. We also obtain a full classification of all clustering schemes satisfying a condition we
refer to as excisiveness. The classification can be changed by varying the notion
of maps of finite metric spaces, and by doing so gives rise to families of clustering schemes that exhibit different degrees of sensitivity to density. Several of the constructions can be carried out on non-symmetric spaces such as networks.

Topological complexity is a measure of how difficult
the motion planning problem is for robots constrained to move according
to a specified configuration space X. If X is contractible, then its
topological complexity is 0. Odd spheres have TC=1 and even spheres have
TC=2. A major problem in the area is to understand the topological
complexity of aspherical spaces. Since these spaces have a fundamental
group G and no higher homotopy groups, the goal would be to understand
TC(K(G,1))
in terms of the subgroup structure, say, of G. This work shows how
cohomological dimension may be used in conjunction with G's subgroup
structure to obtain new lower bounds for TC(K(G,1)).
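For context, a standard cohomological lower bound of this flavor is Farber's zero-divisor cup-length, stated here in the reduced normalization used above (TC of a contractible space is 0):

```latex
\mathrm{TC}(X) \;\ge\; \operatorname{zcl}(X),
```

where $\operatorname{zcl}(X)$ is the length of the longest nonvanishing product of elements of $\ker\bigl(\smile\colon H^{*}(X)\otimes H^{*}(X)\to H^{*}(X)\bigr)$. The results of the talk refine such bounds for $K(G,1)$ spaces using the subgroup structure of $G$.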

Online coverage is perhaps the most common task undertaken by autonomous mobile robots. Much like cleaning a house or mowing a lawn, the objective of this task is to move the robot over every floor tile or patch of the environment, while avoiding collision with obstacles.

While such robots are usually battery operated, new challenges constantly arise which pose greater energy demands than an on-board battery can provide: for instance, autonomous robots that require high power, environments where information can only be transmitted via a communication cable, or high-slippage applications where the cable serves as a reliable positioning device.


This paper is concerned with online tethered coverage, in which a mobile robot of size D is attached to a fixed point S by a cable of finite length L. Starting at S, the robot has to cover an unknown planar environment containing obstacles, and return to S with the cable fully retracted. The paper first establishes an optimal off-line tethered coverage methodology, then introduces the TC algorithm, which performs online tethered coverage using position and local obstacle detection sensors. The performance of the TC algorithm is measured by its competitiveness, determined by measuring its total online path length l relative to the optimal off-line solution l_opt. The paper establishes that the TC algorithm has a competitive performance of l <= 3 l_opt / 2. Execution examples and experiments with a tethered recoiling mechanism illustrate the usefulness of the TC algorithm.

Winds influence the ocean surface in a complicated (and not well-understood) way that depends on ocean chemistry, density, and temperature. Understanding the small-scale structure of winds over the ocean surface is critical for understanding the impact of certain rapidly evolving environmental problems, such as oil spills, algal blooms, and floating debris. The key factor limiting our understanding of weather patterns over the ocean is low-resolution wind data from outdated sensors and overly simplistic analytic methods. Current satellite-borne wind-measuring systems give wind measurements spaced 2.5 km apart; our approach yields similarly accurate measurements a few hundred meters apart, largely due to the availability of higher-resolution data products from TerraSAR-X. Our focus is on the spatial variability of the ocean surface at small scales (tens of meters), and therefore largely addresses the visibility of gusts. This poster shows the result of analyzing a high-resolution ocean image from the German satellite TerraSAR-X using a topological filter, and its subsequent detection of small spatial-scale anomalies in the ocean wave field.

Recently, sheaves have become useful for addressing problems in signal processing. Morphisms between sheaves provide a handy formal construct for understanding the relationship between measurements, intermediate data, and processed outputs. The resulting topological filters generalize the linear filters that engineers use extensively, but also describe novel nonlinear filters. Because they are built from sheaves, the local structure of these filters can be tailored easily and may provide a solid theoretical grounding for nonlinear matched filters. I'll exhibit some recent results of data processed using these techniques, including wind-field extraction from satellite imagery.

In this talk I will discuss some problems involving persistence computation, highlighting two particular cases. The first looks at the problem of approximating relative persistent homology by extending results on the stability of persistence modules. The second looks at how to compute persistence with bounded memory usage using a spectral sequence approach. I will discuss how similar problems arise in both examples.

In this talk we will describe methodologies for localizing both a single vehicle and a team of vehicles navigating in a complex environment without GPS. In the first part of the talk, we will consider the situation where vehicles (or a single vehicle navigating in an environment with multiple beacons) can measure their relative (inter-vehicle) distances. In this case, the problem can be posed as a distributed graph embedding problem. The convergence speed of this type of algorithm depends strongly on the spectral properties of the underlying network (graph), and we will discuss a distributed clustering algorithm that can accelerate the distributed embedding problem for slowly varying networks. Next, we will consider situations where vehicle teams can only coarsely compare relative distance measurements. We will show that the problem can be posed, once again, as a graph embedding problem and demonstrate how this information can be fused with inertial sensors. Finally, we discuss the limiting case where a vehicle navigating in an environment with multiple beacons can coarsely localize itself by leveraging visibility (binary detection) information as well as a low-grade inertial sensor. Key to this final result is leveraging local homology computations to enable successful loop closure in a probabilistic filter. Initial experimental results for this last scenario will be presented.
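The core of the distance-based formulation, recovering vehicle positions (up to rigid motion) from pairwise distances, can be illustrated with classical multidimensional scaling. This is a minimal centralized sketch, not the distributed algorithm from the talk; the function name is illustrative.

```python
import numpy as np

def classical_mds(D, dim=2):
    """Recover point coordinates (up to rotation, reflection, and translation)
    from an n x n matrix D of pairwise Euclidean distances."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # double-centered Gram matrix
    w, V = np.linalg.eigh(B)                 # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:dim]          # keep the top `dim` components
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))
```

When the input distances are exactly Euclidean, the recovered configuration reproduces them; with noisy inter-vehicle measurements the same construction gives a least-squares-style embedding.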

The intrinsic volumes generalize both Euler characteristic and Lebesgue volume, quantifying the size of a set in various ways. A random cubical complex is a union of (possibly high-dimensional) unit cubes selected from a lattice according to some probability model. We analyze the intrinsic volumes of various types of random cubical complexes, obtaining polynomial formulae for the expected value and variance of these intrinsic volumes. For our primary model, we prove an interleaving theorem about the roots of the expected intrinsic volumes -- that is, the values of the probability parameter at which an expected value is zero. This work is motivated by the study of noise in digital images, and is in collaboration with Michael Werman of The Hebrew University of Jerusalem.
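The lowest-order intrinsic volume, the Euler characteristic, is easy to explore numerically for a simple 2D instance of such a model. The sketch below (illustrative, not the authors' formulae) keeps each unit square of an n x n grid independently with probability p, takes the union of the closed squares, and computes chi = V - E + F; a Monte Carlo average then estimates the expected value as a function of p.

```python
import random

def euler_characteristic(squares):
    """Euler characteristic chi = V - E + F of a union of closed unit squares.
    `squares` is a set of (i, j) lower-left corners on the integer lattice."""
    verts, edges = set(), set()
    for (i, j) in squares:
        for dx in (0, 1):
            for dy in (0, 1):
                verts.add((i + dx, j + dy))
        edges.add(('h', i, j))      # bottom edge
        edges.add(('h', i, j + 1))  # top edge
        edges.add(('v', i, j))      # left edge
        edges.add(('v', i + 1, j))  # right edge
    return len(verts) - len(edges) + len(squares)

def expected_chi_mc(n, p, trials=2000, seed=0):
    """Monte Carlo estimate of E[chi] for an n x n grid of unit squares,
    each kept independently with probability p."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        kept = {(i, j) for i in range(n) for j in range(n)
                if rng.random() < p}
        total += euler_characteristic(kept)
    return total / trials
```

For p = 1 the full grid is contractible and chi = 1; for small p the complex is mostly isolated squares and the estimate approaches the expected number of components, consistent with the polynomial-in-p behavior described in the abstract.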