Abstracts and Talk Materials

Consider the Vietoris-Rips complex of the circle of unit circumference equipped with the geodesic metric. Its persistent homology diagram contains an interval of positive length in every odd dimension. In particular, the persistence diagram for (2k+1)-dimensional homology consists of the open interval with birth time k/(2k+1) and death time (k+1)/(2k+3), and over this interval the Vietoris-Rips complex is homotopy equivalent to the (2k+1)-dimensional sphere.
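A quick sanity check on these endpoints: the death time of the interval in dimension 2k+1 equals the birth time of the interval in dimension 2(k+1)+1, so the intervals abut and together tile (0, 1/2). A minimal sketch in exact arithmetic (the function name is ours, not from the abstract):

```python
from fractions import Fraction

def vr_interval(k):
    """Persistence interval for (2k+1)-dimensional homology of the
    Vietoris-Rips complex of the circle of unit circumference."""
    birth = Fraction(k, 2 * k + 1)
    death = Fraction(k + 1, 2 * k + 3)
    return birth, death

# First few intervals: (0, 1/3), (1/3, 2/5), (2/5, 3/7), ...
intervals = [vr_interval(k) for k in range(4)]
```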

Joint work-in-progress with Michal Adamaszek, Christopher Peterson, and Corrine Previte.

The Poisson and Martin boundaries of a planar graph will be discussed and shown to be equivalent to the geometric boundary when the planar embedding is sufficiently nice. Based on joint work with Barlow, Gurel-Gurevich, and Nachmias.

We have introduced the weight of a group that admits a presentation in which the number of relations is at most the number of generators. We have shown that the number of vertices in any crystallization of a connected closed 3-manifold $M$ is at least the weight of the fundamental group of $M$. This lower bound is sharp for the 3-manifolds $\mathbb{RP}^3$, $L(3,1)$, $L(5,2)$, $S^1 \times S^1 \times S^1$, $S^2 \times S^1$, the twisted product $S^2 \tilde{\times} S^1$, and $S^3/Q_8$, where $Q_8$ is the quaternion group.
Moreover, there is a unique such facet-minimal crystallization in each of these seven cases. We have also constructed crystallizations of $L(kq-1,q)$ with $4(q+k-1)$ facets for $q, k \geq 2$ and of $L(kq+1,q)$ with $4(q+k)$ facets for $q, k \geq 1$. By a recent result of Swartz, our crystallizations of $L(kq+1,q)$ are facet minimal when $kq+1$ is even. Our construction of a crystallization of a 3-manifold $M$ is based on a presentation of the fundamental group of $M$.

Given a finite point set X, one constructs the Cech complex at scale r by taking the nerve of the cover by r-balls centered at the points of X.
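The nerve operation itself is easy to sketch. In the sketch below (our own illustration, not the authors' code) the cover elements are given as finite point sets, standing in for discretized balls; a set of cover indices spans a simplex exactly when the corresponding sets share a point:

```python
from itertools import combinations

def nerve(cover, max_dim=2):
    """Nerve of a finite cover: `cover` is a list of sets; indices
    {i0,...,ik} span a k-simplex when the sets have a common point."""
    simplices = []
    for k in range(1, max_dim + 2):  # subsets of size 1 .. max_dim+1
        for idx in combinations(range(len(cover)), k):
            common = set.intersection(*(set(cover[i]) for i in idx))
            if common:
                simplices.append(idx)
    return simplices

# Toy cover of five points on a line by three overlapping sets.
cover = [{0, 1, 2}, {2, 3}, {3, 4}]
```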

Given a direction vector u and a scale 0 < c < 1, we construct a cover by ellipsoids with major axis parallel to u of length r; the axes normal to u have length cr. We then compute the homology of the complex obtained as the nerve of this cover. This additional structure defines a generalized persistence module, yields a correspondence between rotations and interleavings of persistence modules, and suggests an approach to detecting directed homological features.

This is joint work with Athanasios Gentimis at NC State.
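A membership test for one such ellipsoid can be sketched directly from the description; here we assume (our reading, not stated in the abstract) that r and cr are half-axis lengths and that u is a unit vector:

```python
def in_ellipsoid(x, center, u, r, c):
    """Test whether point x lies in the ellipsoid centered at `center`
    with major half-axis r along the unit vector u and normal half-axes c*r."""
    v = [xi - ci for xi, ci in zip(x, center)]
    along = sum(vi * ui for vi, ui in zip(v, u))      # component along u
    perp2 = sum(vi * vi for vi in v) - along * along  # squared normal part
    return (along / r) ** 2 + perp2 / (c * r) ** 2 <= 1.0
```

The nerve of a collection of such ellipsoids can then be computed exactly as for balls, with this test replacing the ball-membership test.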

A large variety of complex systems, from the brain to weather networks and complex infrastructures, are formed by several networks that coexist, interact and coevolve, forming a "network of networks". Modeling such multilayer structures and characterizing the rich interplay between their structure and their dynamical behavior is crucial in order to understand and predict complex phenomena. In this talk I will present recent works on the statistical mechanics of multiplex networks. Multiplex networks are formed by N nodes linked in different layers by different networks. I will present models that generate multiplexes with different types of correlations between the layers, and characterize new percolation phenomena on multiplex networks, showing first-order phase transitions, bistability, or a complex phase diagram with tricritical points and higher-order critical points.

The talk will provide an overview of complex information systems including quantifying, managing, and designing heterogeneous networked systems. Methods of measuring and assessing the performance of networked, software, and hardware integrated systems such as cloud architectures will be discussed including techniques of sparse approximation in systems measurements, and algebraic and topological statistical metrics for performance. Strategies of quantifying risk over different geometric and statistical classes of distributed systems will be examined as well as methods of tracking and coding dynamic information flows.

Dr. Bonneau is a Division Chief of the Information, Decision and Complex Networks Division at the Air Force Office of Scientific Research, and has established programs in the information theory and mathematics of networking and computing in the Mathematics, Information, and Biological Sciences Division. He is also on the staff of the Assistant Secretary of Defense in Research and Engineering as a specialist in communications and information technology and a Lecturer at George Washington University in the Statistics Department. Previously, Dr. Bonneau was a Senior Research Scientist at the Air Force Research Laboratory, Information Directorate in networking, communications, sensing, and computing, and a Program Manager at the Defense Advanced Research Projects Agency (DARPA) in communications. He has held academic positions in communications and sensing research at Rensselaer Polytechnic Institute and Columbia University. Dr. Bonneau has a Ph.D. in electrical engineering from Columbia University, and a Masters and Bachelors in electrical engineering from Cornell University. Dr. Bonneau is a Senior Member of IEEE and has over 80 journal and conference papers, 1 book co-authorship, contributed to 2 book chapters, and holds 3 patents.

Random graph models are commonly used as a null-hypothesis in the study of
real-world networks. The random graphs generated by such models are used to
deduce the significance of properties of real networks. As such, it is
important that they sample uniformly from a set of networks that resemble
the real network under study. For instance, it is common to sample from a
specific network class, such as the class of simple undirected networks. It
is also common to fix the degree sequence of the random networks, since the
degree distribution has a big impact on other network properties.

Directed acyclic networks are an important class of networks; they appear in biology, computer science, engineering, pure mathematics, and statistics. The defining property of a directed acyclic network is that its vertices can be ordered such that all edges are directed from "new" to "old". Such an ordering on the vertices is called a topological ordering.

Existing random graph models introduce unnatural features when used to randomize this class of networks, such as directed cycles and multiple edges. We propose the ordered switching method to overcome the shortcomings of existing models. As its name suggests, it is a type of switching method. We prove that this method samples uniformly from all directed acyclic networks with fixed in-degree and out-degree sequence and fixed topological ordering.

Since directed acyclic networks are an important class of networks and random graph models are widely used to find significant properties of real networks, there are many applications for this new random network model.
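A single switching step of the kind described can be sketched as follows (our illustration; the function name and the acceptance rule's exact form are assumptions, not the authors' specification). Two edges (a,b), (c,d) are replaced by (a,d), (c,b) only when both new edges still point forward in the fixed topological order and no duplicate edge is created, so the in/out-degree sequences, the topological ordering, and acyclicity are all preserved:

```python
import random

def ordered_switch(edges, order, tries=1000, seed=0):
    """Repeatedly attempt degree- and order-preserving edge switches on a DAG.
    edges: set of directed edges (a, b) with order[a] < order[b]."""
    rng = random.Random(seed)
    edges = set(edges)
    for _ in range(tries):
        (a, b), (c, d) = rng.sample(sorted(edges), 2)
        # Swap to (a, d), (c, b) only if both still point forward
        # in the topological order and no edge is duplicated.
        if order[a] < order[d] and order[c] < order[b]:
            if (a, d) not in edges and (c, b) not in edges:
                edges -= {(a, b), (c, d)}
                edges |= {(a, d), (c, b)}
    return edges
```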

In the talk, after an introduction to the cop and robber game and to Gromov hyperbolicity, we will outline the proof that all cop-win graphs G in the game in which the robber and the cop move at different speeds s and s' with s' < s are Gromov hyperbolic. Since this holds for any speeds s > s' > 0, it establishes a new --game-theoretical-- characterization of Gromov hyperbolicity. Using these results, we describe a simple constant-factor approximation of the hyperbolicity of a graph on n vertices running in O(n^2 log n) time. Joint work with J. Chalopin, P. Papasoglu, and T. Pecatte.

Harish Chintakunta (North Carolina State University)

http://www4.ncsu.edu/~hkchinta/index.html

Hamid Krim (North Carolina State University)

http://www.ece.ncsu.edu/people/ahk

The persistent/zig-zag persistent intervals provide a
meaningful topological summary of a given filtration/sequence of
spaces. However, there often arises the need to compute sparse
generators for these intervals which serve to "track" the features
corresponding to the intervals. We will address this problem on two
aspects: 1) simplification of the data in order to reduce the
computational complexity while preserving the desired topological
features, and 2) computing a specific summand decomposition of the
given persistent/zig-zag module where the summands result in the
desired generators. Further, given a choice of a set of filtrations, we
will present meaningful ways to choose the right filtration where the
resulting bar-code reflects some underlying geometric features.

In this talk I will survey some results on statistical properties of matchings of very large and infinite graphs. The main goal of the talk is to describe a few applications of a new concept called matching measure. These applications include new results on the number of (perfect) matchings in large girth graphs as well as simple new proofs of certain statistical physical theorems. This is joint work with various subsets of Miklós Abért, Péter E. Frenkel, Tamás Hubai and Gábor Kun.
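The matching measure mentioned here is built from the matching polynomial, whose coefficients count matchings of each size. As a small illustrative aside (our own sketch, not part of the talk), the total number of matchings of a graph can be computed by the standard deletion recurrence:

```python
def count_matchings(edges, vertices):
    """Number of matchings (including the empty one), via the recurrence
    m(G) = m(G - v) + sum over neighbors u of v of m(G - v - u)."""
    vertices = sorted(vertices)
    if not vertices:
        return 1
    v, rest = vertices[0], vertices[1:]
    sub = [e for e in edges if v not in e]
    total = count_matchings(sub, rest)  # case: v left unmatched
    for a, b in edges:
        if v in (a, b):                 # case: v matched to neighbor u
            u = b if v == a else a
            sub2 = [e for e in edges if v not in e and u not in e]
            total += count_matchings(sub2, [w for w in rest if w != u])
    return total
```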

Recent data sparsification strategies in topological data analysis, such as the Graph Induced Complex and the sparsified Rips complex, give rise to a sequence of simplicial complexes connected by simplicial maps rather than inclusions. As a result, the need for computing topological persistence under such maps arises. We propose a practical algorithm for computing such persistence in Z2-homology.

A simplicial map can be decomposed into a set of elementary inclusions and vertex collapses--two atomic operations that can be supported efficiently with the notion of simplex annotations for computing persistent homology. A consistent annotation through these atomic operations implies the maintenance of a consistent cohomology basis, hence a homology basis by duality. While the idea of maintaining a cohomology basis through an inclusion is not new, maintaining them through a vertex collapse is new, which constitutes an important atomic operation for simulating simplicial maps. Annotations support the vertex collapse in addition to the usual inclusion quite naturally. We exhibit an application of this new tool in which we approximate the persistence diagram of a filtration of Rips complexes where vertex collapses are used to tame the blow-up in size.

Extracting concise and meaningful features from a network is extremely hard due to the intricate influence patterns displayed at all scales, and often also due to its huge size. A novel topological way to approach this problem is to adapt persistent homology theory to networks. This can be done naturally for undirected graphs using the clique complex construction. Our aim is to extend the persistent homology approach to directed graphs. For these graphs the main problem is the identification of clique complexes, due to the presence of double links and loops. In our work we have developed a mapping that allows one to build an undirected graph in a consistent way, i.e., such that the information about the directionality of the original graph is preserved. While this procedure implies a proliferation of nodes, it provides the basic object, the undirected graph, on which the identification of clique complexes is possible and the machinery of persistent homology can be applied. While this construction is fairly general, we discuss its applicability to a specific kind of directed graphs, namely those describing Markov processes.

Joint work with G. Petri, F. Vaccarino and R. Lima

Analysis of Internet topologies at both the router and AS level has shown that the Internet topology has negative curvature, measured by Gromov's "thin triangle condition". The negative curvature property is tightly related to structural properties of the network, such as core congestion and route reliability. In this work we analyze the discrete Ricci curvature of Internet topologies, as defined and studied by Ollivier, Lin and Yau, and others. Ricci curvature measures whether local distances diverge or converge; it is a local measure that can be quickly computed, and a more refined one that allows us to understand the network structure at various resolutions. Using various Internet data sets, we make a few universal observations: 1) the network uses negatively curved edges to connect clusters of nodes, within which edges are mostly positively curved; 2) negative Ricci curvature is closely correlated to network congestion; 3) geodesics through negatively curved edges are more stable under node/link perturbations. We also show that the configuration model using a power-law degree distribution generates networks of similar curvature distribution, while the G(n, p) model does not.

This is joint work with Chien Chun Ni and Xianfeng David Gu from Stony Brook University.
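Ollivier-Ricci curvature on a graph can be sketched concretely. The sketch below (our illustration; conventions vary in the literature) uses the zero-idleness convention, placing the uniform measure m_v on the neighbors of v, and assumes deg(x) = deg(y) so that the Wasserstein distance W1 reduces to a minimum-cost perfect matching, enumerated by brute force:

```python
from itertools import permutations
from collections import deque

def bfs_dist(adj, s):
    """Hop distances from s in an unweighted graph (adjacency dict)."""
    dist, q = {s: 0}, deque([s])
    while q:
        v = q.popleft()
        for w in adj[v]:
            if w not in dist:
                dist[w] = dist[v] + 1
                q.append(w)
    return dist

def ollivier_ricci(adj, x, y):
    """kappa(x,y) = 1 - W1(m_x, m_y)/d(x,y), m_v uniform on neighbors of v.
    Assumes deg(x) == deg(y); W1 computed as a min-cost matching."""
    dx = bfs_dist(adj, x)
    nbrs_x, nbrs_y = sorted(adj[x]), sorted(adj[y])
    assert len(nbrs_x) == len(nbrs_y)
    dists = {v: bfs_dist(adj, v) for v in nbrs_x}
    w1 = min(sum(dists[a][b] for a, b in zip(nbrs_x, perm)) / len(nbrs_x)
             for perm in permutations(nbrs_y))
    return 1 - w1 / dx[y]
```

On the 5-cycle every edge has curvature 0, while a triangle edge is positively curved, matching the intuition that edges inside clusters are positively curved.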

For graphs, expansion is a useful concept that has found applications in various areas. The success of this concept has inspired the search for a corresponding notion in higher dimensions.
Expansion for graphs can be expressed combinatorially or via the spectrum of the Laplacian. The close connection of these two notions is expressed, e.g., by the discrete Cheeger inequality.
We describe several possible notions of expansion for simplicial complexes of higher dimension as well as results exploring connections between these properties.
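The graph case referred to here can be checked numerically. The brute-force sketch below (our illustration, feasible only for small graphs) computes the conductance-style Cheeger constant h(G) and the second eigenvalue of the normalized Laplacian, which the discrete Cheeger inequality sandwiches as lambda_2/2 <= h <= sqrt(2*lambda_2):

```python
import numpy as np
from itertools import combinations

def cheeger_and_lambda2(adj):
    """Cheeger constant (volume-normalized edge expansion) and second
    eigenvalue of the normalized Laplacian of a connected graph."""
    n = len(adj)
    A = np.zeros((n, n))
    for v, nbrs in adj.items():
        for w in nbrs:
            A[v, w] = 1
    deg = A.sum(axis=1)
    Dinv = np.diag(1 / np.sqrt(deg))
    lam2 = np.sort(np.linalg.eigvalsh(np.eye(n) - Dinv @ A @ Dinv))[1]
    vol = deg.sum()
    # Brute force over all proper vertex subsets S: cut(S)/min(vol S, vol S^c).
    h = min(
        sum(A[v, w] for v in S for w in range(n) if w not in S)
        / min(deg[list(S)].sum(), vol - deg[list(S)].sum())
        for k in range(1, n)
        for S in map(set, combinations(range(n), k))
    )
    return h, lam2
```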

This is joint work with Jared Culbertson (AFRL) and Peter Stiller (Texas A&M). Extending the work of Carlsson--Memoli, we aim to construct a method for non-nested hierarchical clustering that is based on Isbell's theory of injective spaces. Every metric space isometrically embeds in a unique (up to isometry) injective envelope that highlights canonical features of the original metric space. Inspired by the work of Bandelt--Dress on split decompositions, we seek to exploit the structure of the injective envelope to determine a non-nested clustering program which replaces tree-based dendrograms with cubical complexes. Towards this end, we present results on the injectivity of cubical complexes and discuss some of the limitations of related work.

There are a variety of ways to measure the stretching that occurs when one surface is deformed onto another. Various energies can be defined to quantify such distortions, and the magnitude of these energies can be used to compare the difference between two shapes.

We will discuss some applications of these ideas to mathematics and to biology. In topology, these ideas lead to new results on the space of triangulations of a surface. In biology, we will discuss computations used to compare brain cortices and proteins.

This talk will highlight some interesting maps of topological spaces giving rise to maps of face posets and how an interplay of combinatorics and topology may help reveal structure of both the image and the fibers of the map. Much of the talk will focus on joint work with Alex Engstrom and Bernd Sturmfels regarding images of monomial maps on probability spaces, namely maps given by d monomials in n variables where these variables are probabilities, so maps from an n-dimensional unit cube to a d-dimensional unit cube. We dub the images of such maps as 'toric cubes'. The motivating example is the edge product space of phylogenetic trees. For any monomial map, we provide a cell structure for the image which in fact is a regular CW decomposition of the image; this involves introducing a new family of maps which are generically injective. As time permits, we will briefly highlight other interesting examples of generically injective maps coming from total positivity theory and from electrical networks as well as some new combinatorial-topological tools for studying homeomorphism type and fibers of such maps.

Edmond Jonckheere (University of Southern California)

http://ee.usc.edu/faculty_staff/faculty_directory/jonckheere.htm

Ollivier-Ricci curvature appeared in the context of Riemannian geometry as a coarse notion of the ordinary Ricci curvature. It then migrated to graphs, where its interpretation as a transportation cost (formalized via the first Wasserstein distance) clearly makes it relevant to networking problems. The first occurrence of the Ollivier-Ricci curvature is in the context of wireless networking under a protocol inspired by, but different in many respects from, Back-Pressure, referred to as Heat-Diffusion. Equipped with a concept of heat diffusion on a directed interference network, it will be shown that the Heat-Diffusion protocol has the unique property of throughput optimality in addition to Pareto optimality relative to average queue length and routing cost. As a major result, it will be shown that average queue length and routing cost both increase as the Ollivier-Ricci curvature decreases (becomes more negative). In addition, as the Ollivier-Ricci curvature decreases, the capacity region shrinks. In a sense, the Ollivier-Ricci curvature plays the same role as the Gromov curvature of wireline networks. It should be noted, however, that the protocols for wireless and wireline networks are different and hence call for different curvature concepts to deal with the "congestion" problem. This already gives the hint that the two curvatures cannot be related to each other. This fact is best confirmed by the tree-width problem, instrumental in mapping--and solving--a QUBO problem on a quantum adiabatic architecture. Simulation results on a variety of graphs show a simple linear interpolation relation between tree-width and Ollivier-Ricci curvature, while Gromov curvature, related to tree-length, can be demonstrated to be incomparable with tree-width.

(This is work in progress with Dominic Dotterrer and Larry Guth.)

In a graph, the girth is the length of the smallest cycle. How large the girth can be for a graph on n vertices and m edges is a very well studied problem in combinatorics. More generally, in a d-dimensional simplicial complex, we define the d-systole to be the smallest nonempty collection of closed d-dimensional faces whose union has no boundary, and we measure the size of a systole in terms of volume, i.e. the number of faces. It is natural to ask what is the largest possible d-systole for a simplicial complex on n vertices with m top-dimensional faces.

We show the existence of simplicial complexes with large systoles using random simplicial complexes with modifications, and we also require some new results on estimating the number of triangulated surfaces on a given number of vertices. On the other hand, we show that the systoles can not be much larger than this, so these results are essentially optimal. In the higher-dimensional setting, there are surprising contrasts with the classical graph theoretic picture, and in particular the systoles can be quite large.
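For the graph case (the 1-dimensional systole), girth can be computed by the standard BFS-based method, sketched below for small unweighted graphs as our own illustration: from each start vertex, the first non-tree edge that closes two BFS branches gives a cycle-length candidate, and the minimum over all start vertices is the girth.

```python
from collections import deque

def girth(adj):
    """Length of a shortest cycle in an unweighted graph (adjacency dict);
    returns infinity for acyclic graphs."""
    best = float("inf")
    for s in adj:
        dist, parent = {s: 0}, {s: None}
        q = deque([s])
        while q:
            v = q.popleft()
            for w in adj[v]:
                if w not in dist:
                    dist[w] = dist[v] + 1
                    parent[w] = v
                    q.append(w)
                elif parent[v] != w and parent[w] != v:
                    # Non-tree edge: two BFS branches meet, closing a cycle.
                    best = min(best, dist[v] + dist[w] + 1)
    return best
```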

Random geometric graphs in Lorentzian spaces of constant positive curvature---de Sitter spaces---are shown to model well the structure and dynamics of some complex networks, such as the Internet, social and biological networks. Random geometric graphs on Lorentzian manifolds are known as causal sets in quantum gravity where they represent discretizations of the global causal structure of spacetime. If the expansion of a universe is accelerating, its spacetime is asymptotically de Sitter spacetime. Several connections between random de Sitter graphs, and Apollonian circle packings, Farey trees, divisibility network of natural numbers, and the Riemann hypothesis are also discussed.

Given a simplicial complex K with weights on its simplices and a chain on it, the Optimal Homologous Chain Problem (OHCP) is to find a chain with minimal weight that is homologous to the given chain. The OHCP is NP-complete, but if the boundary matrix of K is totally unimodular (TU), it becomes solvable in polynomial time when modeled as a linear program (LP). We define a condition on K called non total-unimodularity neutralized (NTUN), which ensures that even when the boundary matrix is not TU, the OHCP LP must contain an integral optimal vertex for every input chain. This is joint work with Gavin Smith. A preprint is available at http://www.arXiv.org/abs/1304.4985.
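Total unimodularity means every square submatrix has determinant -1, 0, or 1. A brute-force check (our illustrative sketch; exponential, so only for tiny matrices) together with the classical example of a graph boundary matrix, which is always TU:

```python
from itertools import combinations

def is_totally_unimodular(M):
    """True iff every square submatrix of M has determinant in {-1, 0, 1}."""
    def det(rows, cols):
        # Laplace expansion along the first listed row.
        if len(rows) == 1:
            return M[rows[0]][cols[0]]
        return sum((-1) ** j * M[rows[0]][cols[j]]
                   * det(rows[1:], cols[:j] + cols[j + 1:])
                   for j in range(len(cols)))
    m, n = len(M), len(M[0])
    for k in range(1, min(m, n) + 1):
        for rows in combinations(range(m), k):
            for cols in combinations(range(n), k):
                if det(list(rows), list(cols)) not in (-1, 0, 1):
                    return False
    return True

# Vertex-edge incidence (boundary) matrix of an oriented triangle;
# such matrices are network matrices, hence totally unimodular.
d1 = [[-1, -1,  0],
      [ 1,  0, -1],
      [ 0,  1,  1]]
```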

Although graphs and matrices are popular ways to model data drawn from
many application domains, the size scale, sparsity properties, noise
properties, and many other aspects of graphs and matrices arising in
realistic machine learning and data analysis applications present
fundamental challenges for traditional matrix and graph algorithms.
Informally, this at root has to do with the observation that small-scale
or local structure in many data sets is often better or more meaningful
than large-scale or global structure, which is exactly the opposite of
what many traditional asymptotic tools typically implicitly assume. For
example, there might be meaningful small-scale clusters in social networks
that are tree-like or expander-like at large size scales. Here, I will
describe how eigenvector localization can be used to justify these sorts
of claims, how localization can be engineered in a principled way into
popular matrix and graph algorithms in order to characterize local
properties in large data sets much more generally, and how seemingly-minor
details in the approximate computation of localized (as well as
non-localized) objectives can make a large difference in the usefulness of
a given method in downstream applications. A common theme will be the
critical role of what can be called implicit regularization---that is,
regularization not in the usual sense of explicitly adding a
regularization term and solving the modified objective exactly, but
instead as an implicit side effect of heuristics that are often used to
make approximation algorithms scale well---as well as the question of what
is the objective function that an approximation algorithm implicitly solves exactly.

The large-scale structure of the Internet (or, rather, of the graph of Autonomous Systems) has been attracting a lot of attention from researchers for decades. One family of attractive models for this graph stipulates that it "looks like" it was sampled from a hyperbolic plane. We discuss possible tests for the dimension of samples from manifolds, and apply them to the AS graph.

We state some results on Cheeger Inequalities for the combinatorial
Laplacian and random walks on simplicial complexes.

Specifically, for the combinatorial Laplacian we prove that a Cheeger type inequality holds on the highest dimension, or for the boundary operator with Dirichlet boundary conditions. We also show that coboundary expanders do not satisfy natural Buser or Cheeger inequalities. We provide some statements about middle dimensions.

We also introduce random walks with absorbing states on simplicial complexes. Given a simplicial complex of dimension d, a random walk with an absorbing state is defined which relates to the spectrum of the k-dimensional Laplacian. We also examine an application of random walks on simplicial complexes to a semi-supervised learning problem. Specifically, we consider a label propagation algorithm on oriented edges, which applies to a generalization of the partially labelled classification problem on graphs.

Joint work with: John Steenbergen, Caroline Klivans, Anil Hirani, and Mark Schubel

We introduce A-infinity persistence of a given homology filtration of topological spaces. This is a family of homological invariants which provide information not readily available from the persistent Betti numbers of the given filtration. This may help to detect noise, not just in the simplicial structure of the filtration but in further geometrical properties into which the higher diagonals of the A-infinity structure are translated. For instance, persistence of the second diagonal translates into persistence of indecomposability of cup products. In the same way, persistence of the third diagonal translates into persistence of indecomposability of Massey products.

Joint work with Francisco Belchi

Descartes showed that a planar graph with n vertices, n at least 3, has at most 3n-6 edges; this follows easily from Euler's formula. We seek an analog in higher dimensions. Kalai and Sarkaria conjectured that for d-dimensional simplicial complexes embeddable in the 2d-sphere, the ratio between the number of d-faces and (d-1)-faces is bounded by a constant depending on d.
We present a new approach towards this problem, by suggesting a rigidity theory for balanced complexes, analogous to the classical framework rigidity.
Based on a joint work with Gil Kalai and Isabella Novik, arXiv:1312.0209.

There are well known connections between the random walk on a graph, and its topological and spectral properties. In this talk we will define a new stochastic process on higher dimensional simplicial complexes, which reflects their homological and spectral properties in a parallel way. This leads to high dimensional analogues (not all of which hold!) of classical theorems of Kesten, Alon-Boppana, and others. Based on a joint work with Ori Parzanchevski.

I will start describing some basics of the graph Laplacian eigenvectors of a given graph and their properties. In particular, I will describe the peculiar phase transition/localization phenomena of such eigenvectors observed on a certain type of graphs (e.g., dendritic trees of neurons). Then, I will describe how to construct wavelet packets on a given graph including the Haar-Walsh basis dictionary using the graph Laplacian eigenvectors. As an application of such basis dictionaries, I will discuss efficient approximation of functions given on graphs. This is a joint work with Jeff Irion, Yuji Nakatsukasa, and Ernest Woei.
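The basic object of the talk, the graph Laplacian eigendecomposition, is easy to sketch; on a path graph the eigenvectors are the discrete cosine modes, the graph analogue of Fourier modes. A minimal sketch (our own, for the combinatorial Laplacian L = D - A):

```python
import numpy as np

def path_laplacian_eigs(n):
    """Eigenvalues/eigenvectors of the combinatorial Laplacian of the
    path graph P_n; the eigenvectors serve as 'graph Fourier modes'."""
    A = np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
    L = np.diag(A.sum(axis=1)) - A
    vals, vecs = np.linalg.eigh(L)   # ascending eigenvalues
    return vals, vecs
```

The eigenvalues of the path are known in closed form, 4*sin^2(pi*k/(2n)), with the constant vector as the eigenvector for eigenvalue 0.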

We provide evidence that networks representing social interactions, as measured and archived by electronic communication systems, such as friendship, collaboration, co-authorship, web, peer-to-peer and other kindred networks, exhibit strong hyperbolicity in the sense of Gromov (the 4-point delta being much smaller than the graph diameter). We outline how one may exploit hyperbolicity to massively reduce computations on such graphs. As an example, we show that all-pair distances on a (mobile call) graph with E = 100s of millions of edges may be approximated in O(E) time (i.e., in tens of minutes) with small average error (well below 10% additive error), far smaller than anticipated by theory. We speculate that this result is likely due to the uncharacteristically small hyperbolic cores of these networks (i.e., the set of nodes whose betweenness centrality scales like k*N^2 with k ~ 0.5, where N is the number of nodes, as N goes to infinity).
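The 4-point delta mentioned above can be computed directly from the distance matrix: for each quadruple of vertices, form the three pairings of pairwise-distance sums, and take half the gap between the two largest; delta is the maximum over all quadruples. A brute-force sketch (feasible only on small graphs, unlike the large-scale approximations the abstract describes):

```python
from itertools import combinations
from collections import deque

def bfs_distances(graph, src):
    """Hop distances from src in an unweighted graph given as adjacency lists."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        v = queue.popleft()
        for w in graph[v]:
            if w not in dist:
                dist[w] = dist[v] + 1
                queue.append(w)
    return dist

def four_point_delta(graph):
    """Gromov hyperbolicity via the 4-point condition (brute force)."""
    d = {v: bfs_distances(graph, v) for v in graph}
    delta = 0
    for x, y, z, w in combinations(graph, 4):
        s = sorted([d[x][y] + d[z][w], d[x][z] + d[y][w], d[x][w] + d[y][z]])
        delta = max(delta, (s[2] - s[1]) / 2)
    return delta

# A path (a tree) is 0-hyperbolic; the 4-cycle has delta = 1.
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
cycle = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
assert four_point_delta(path) == 0
assert four_point_delta(cycle) == 1
```

Trees have delta = 0, which is the sense in which small delta means "tree-like at large scale."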

We introduce hierarchical quasi-clustering methods, a generalization of hierarchical clustering for asymmetric networks where the output structure preserves the asymmetry of the input data. We show that this output structure is equivalent to a finite quasi-ultrametric space and study admissibility with respect to two desirable properties. We prove that a modified version of single linkage is the only admissible quasi-clustering method. Moreover, we show stability of the proposed method and establish invariance properties fulfilled by it. Algorithms are further developed, and the value of quasi-clustering analysis is illustrated with a study of internal migration within the United States.

In this talk, we discuss recent work on Gromov's delta-hyperbolicity in the context of random graph models, tree-decompositions, and empirical evaluation of network structure. Specifically, we characterize when random intersection graphs have bounded hyperbolicity, give general theoretical bounds on delta in terms of tree-width, and describe a relationship to local measurement of the core structure. More generally, we describe empirical results on tree-like structure in complex networks and suggest several open problems. This is joint work with various subsets of Aaron Adcock, Matthew Farrell, Timothy Goodrich, Nathan Lemons, Michael Mahoney, Michael O'Brien, and Felix Reidl.

Dane R Taylor (Statistical and Applied Mathematical Sciences Institute (SAMSI))

https://sites.google.com/site/danetaylorresearch/home

The study of contagion on networks is central to our understanding of collective social processes and epidemiology. However, for networks arising from an underlying manifold such as the Earth’s surface, the extent to which the dynamics reflect this inherent structure remains unclear, especially when long-range, “noisy” edges are present. We study the Watts threshold model (WTM) for complex contagion on noisy geometric networks – a generalization of small world networks in which nodes are embedded on a manifold. To study the extent to which contagion adheres to the manifold versus the network, which can greatly disagree on notions such as node-to-node distance, we present WTM-maps that embed the network nodes as a point cloud for which we study the topology, dimensionality, and geometry.
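The WTM dynamics are simple to state: an inactive node activates once the fraction of its active neighbors reaches its threshold. A minimal synchronous-update sketch (the noisy geometric networks of the abstract would supply the graph; here a hypothetical toy cycle):

```python
def watts_threshold_model(graph, seeds, threshold):
    """Synchronous Watts threshold model: an inactive node activates when the
    fraction of its active neighbors is at least `threshold`.
    Returns the list of newly activated node sets, one per time step."""
    active = set(seeds)
    history = [set(seeds)]
    while True:
        new = {v for v in graph if v not in active
               and sum(w in active for w in graph[v]) / len(graph[v]) >= threshold}
        if not new:
            return history
        active |= new
        history.append(new)

# Toy example: a 6-cycle seeded at two adjacent nodes, threshold 1/2.
cycle6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
history = watts_threshold_model(cycle6, seeds={0, 1}, threshold=0.5)

# The contagion spreads outward one node per side per step until all are active.
assert set().union(*history) == set(range(6))
assert len(history) == 3
```

On a geometric network, the step at which each node activates gives exactly the activation-time data from which WTM-maps are built.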

Let P be a finite poset. We will show that for any P-persistent object X in the category of finite topological spaces there is a P-weighted graph whose weighted clique complex has the same P-persistent homology as X.

(Joint work with G.Petri (ISI Found - IMA long term visitors), A.Patania (ISI found and Politecnico di Torino) and M.Scolamiero (KTH))

We discuss a number of open questions and results concerning
(from a combinatorialist's point of view) higher-dimensional
analogues of graph planarity and crossing numbers, i.e., embeddings
of finite simplicial complexes (compact polyhedra) into Euclidean space.

While embeddings are a classical topic in geometric topology, here we focus rather on algorithmic and combinatorial aspects. Two typical questions are the following: (1) Is there an algorithm that, given as input a finite k-dimensional simplicial complex, decides whether it embeds in d-dimensional space? (2) What is the maximum number of k-dimensional simplices of a simplicial complex that embeds into d-dimensional space?

Time permitting, we will also discuss some maps with more general restrictions on the set of singularities, e.g., without r-fold intersection points.

The talk will be of a survey-like nature and touch upon joint work with a number of colleagues: M. Cadek, X. Goaoc, M. Krcal, I. Mabillard, J. Matousek, P. Patak, Z. Safernova, F. Sergeraert, E. Sedgwick, M. Tancer, and L. Vokrinek

Michael Werman (Hebrew University)

www.cs.huji.ac.il/~werman

Matthew L. Wright (University of Minnesota, Twin Cities)

https://www.ima.umn.edu/~mlwright/

The intrinsic volumes generalize both Euler characteristic and Lebesgue
volume, quantifying the size of a set in various ways. A random cubical
complex is a union of (possibly high-dimensional) unit cubes selected from
a lattice according to some probability model. We analyze the intrinsic
volumes of various types of random cubical complexes, obtaining polynomial
formulae for the expected value and variance of these intrinsic volumes. We
then prove an interleaving theorem about the roots of the expected
intrinsic volumes -- that is, the values of the probability parameter at
which an expected value is zero. Furthermore, we present a central limit
theorem, showing that the distribution of each intrinsic volume tends
toward a normal distribution as the size of the lattice increases towards
infinity. This work is motivated by the study of noise in digital images
and has applications in image processing.
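The simplest intrinsic volume is the Euler characteristic; for a two-dimensional cubical complex (a union of closed unit squares together with all their edges and vertices) it is just chi = V - E + F. A minimal sketch of that count, with the faces specified by hand rather than drawn from the abstract's probability model:

```python
def euler_characteristic(squares):
    """Euler characteristic V - E + F of a union of closed unit squares,
    each square given by its lower-left lattice corner (i, j)."""
    vertices, edges = set(), set()
    for (i, j) in squares:
        for di in (0, 1):
            for dj in (0, 1):
                vertices.add((i + di, j + dj))
        edges.update({
            ((i, j), (i + 1, j)), ((i, j + 1), (i + 1, j + 1)),  # horizontal
            ((i, j), (i, j + 1)), ((i + 1, j), (i + 1, j + 1)),  # vertical
        })
    return len(vertices) - len(edges) + len(squares)

# A single square is contractible: chi = 1.
assert euler_characteristic({(0, 0)}) == 1
# A solid 2x2 block is also contractible: chi = 1.
assert euler_characteristic({(0, 0), (1, 0), (0, 1), (1, 1)}) == 1
# A ring of 8 squares around an empty center is an annulus: chi = 0.
ring = {(i, j) for i in range(3) for j in range(3)} - {(1, 1)}
assert euler_characteristic(ring) == 0
```

In the random setting, averaging this count over the inclusion probabilities of the faces is what produces the polynomial formulae for the expected intrinsic volumes.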

The edge-triangle exponential random graph model has been a topic of continued research interest. We review recent developments in the study of this classic model and concentrate on the phenomenon of phase transitions. We first describe the asymptotic feature of the model along general straight lines. We show that as we continuously vary the slopes of these lines, a typical graph exhibits quantized behavior, jumping from one complete multipartite structure to another, and the jumps happen precisely at the normal lines of a polyhedral set with infinitely many facets. We then turn to exponential models where certain constraints are imposed and capture another interesting type of jump discontinuity. Based on recent joint work with Alessandro Rinaldo and Sukhada Fadnavis and current joint work in progress with Richard Kenyon.
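The quantities steering the edge-triangle model are the edge and triangle homomorphism densities, and the jumps between complete multipartite structures can be probed by computing them for candidate graphs. A hedged sketch for small dense graphs (bipartite graphs have triangle density zero, which is why a negative triangle parameter favors multipartite structure):

```python
def edge_density(adj):
    """Edge homomorphism density: (ordered adjacent pairs) / n^2."""
    n = len(adj)
    return sum(adj[i][j] for i in range(n) for j in range(n)) / n**2

def triangle_density(adj):
    """Triangle homomorphism density: tr(A^3) / n^3."""
    n = len(adj)
    total = sum(adj[i][j] * adj[j][k] * adj[k][i]
                for i in range(n) for j in range(n) for k in range(n))
    return total / n**3

def complete_multipartite(sizes):
    """Adjacency matrix of the complete multipartite graph with given part sizes."""
    parts = []
    for p, s in enumerate(sizes):
        parts += [p] * s
    n = len(parts)
    return [[1 if parts[i] != parts[j] else 0 for j in range(n)] for i in range(n)]

# The balanced complete bipartite graph K_{3,3}: edge density 1/2, no triangles.
K33 = complete_multipartite([3, 3])
assert edge_density(K33) == 0.5
assert triangle_density(K33) == 0
```

Varying the part sizes trades edge density against triangle density, which is the trade-off along which the quantized jumps described in the abstract occur.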