NETWORK ROUTING ON REGULAR DIRECTED GRAPHS FROM SPANNING FACTORIZATIONS

Abstract. Networks with a high degree of symmetry are useful models for parallel processor networks. In earlier papers, we defined several global communication tasks (universal exchange, universal broadcast, universal summation) that can be critical tasks when complex algorithms are mapped to parallel machines. We showed that utilizing the symmetry can make network optimization a tractable problem. In particular, we showed that Cayley graphs have the desirable property that certain routing schemes starting from a single node can be transferred to all nodes in a way that does not introduce conflicts. In this paper, we define the concept of spanning factorizations and show that this property can also be used to transfer routing schemes from a single node to all other nodes. We show that all Cayley graphs and many (perhaps all) vertex transitive graphs have spanning factorizations.


Introduction
Networks with a high degree of symmetry are useful models for parallel processor networks. In earlier papers [5,8,9] we defined several global communication tasks (universal exchange, universal broadcast, universal summation) which can be critical tasks when complex algorithms are mapped to parallel machines. We showed that utilizing the symmetry can make network optimization a tractable problem. In particular, we showed in [9] (and earlier in [5]) that Cayley graphs have the desirable property that certain routing schemes starting from a single node can be transferred to all nodes in a way which does not introduce conflicts. In this paper, we extend this transference idea to a class of graphs that is more inclusive than Cayley graphs.

Notation for Graph Theory
This paper mainly focuses on directed graphs derived from groups. Here, a directed graph G is a set of vertices V and a collection E of ordered pairs (u, v) of distinct vertices called edges.
We often let n be the number of vertices and m be the number of edges. If some pair appears more than once as an edge, then G is called a multigraph. Otherwise the pairs form a set and G is called elementary. The vertex u in the pair is called the tail of the edge and the vertex v is called the head.
Definition (Cayley coset graph). Let Γ be a finite group, Η a subgroup and ∆ a subset. The Cayley coset graph has as vertices the left cosets of Η in Γ, with a directed edge from the coset uΗ to the coset uδΗ for each δ ∈ ∆. Form the bipartite graph B on two copies of the vertex set by joining u in the first copy to v in the second copy whenever (u, v) is a directed edge in G. The undirected graph B is bipartite and regular with degree d, and so by Hall's Marriage Theorem it can be decomposed into d 1-factors. Each of these 1-factors corresponds to a directed 1-factor in G.
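As a concrete illustration of this decomposition (a minimal sketch; the function names and the edge-list representation are ours, not part of the formal development), the d directed 1-factors of a d-regular digraph can be extracted by repeatedly finding a perfect matching in B with Kuhn's augmenting-path algorithm:

```python
def perfect_matching(adj, n):
    """One tail-to-head perfect matching of the bipartite graph B,
    found by Kuhn's augmenting-path algorithm; a perfect matching
    exists by Hall's theorem because B is regular."""
    match = {}  # head -> tail

    def augment(u, seen):
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                if v not in match or augment(match[v], seen):
                    match[v] = u
                    return True
        return False

    for u in range(n):
        if not augment(u, set()):
            return None
    return {t: h for h, t in match.items()}  # tail -> head

def one_factorization(edges, n, d):
    """Split a d-regular digraph on vertices 0..n-1 into d directed 1-factors."""
    adj = {u: [] for u in range(n)}
    for u, v in edges:
        adj[u].append(v)
    factors = []
    for _ in range(d):
        f = perfect_matching(adj, n)
        factors.append(f)
        for u, v in f.items():  # remove the used edges and repeat
            adj[u].remove(v)
    return factors

# 2-regular example: a directed 4-cycle plus "skip one" chords
edges = [(i, (i + 1) % 4) for i in range(4)] + [(i, (i + 2) % 4) for i in range(4)]
factors = one_factorization(edges, 4, 2)
```

Each returned factor is a permutation of the vertex set, and together the factors partition the edge set, exactly as in the decomposition above.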
In order to create a routing scheme for universal exchange (often called the transpose; see [10]) on G, we consider regular graphs whose factorizations have additional properties.
Definition. Let F_1, F_2, …, F_d be the factors in a 1-factoring of G. A word is a finite string of symbols from the set {F_1, F_2, …, F_d}. If v is a vertex and ω is a word, then vω denotes the directed path (and its endpoint) in G starting at v and proceeding along the unique edge corresponding to each consecutive factor represented in the word ω. If G is a graph with n vertices, we say that a 1-factoring and a set of n words W = {ω_1, ω_2, …, ω_n} form a spanning factorization of G if, for every vertex v, the endpoints of the paths vω_1, vω_2, …, vω_n are exactly the n vertices of G.
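The definition can be checked mechanically. In this sketch (the names and the representation, factors as tail-to-head dictionaries and words as tuples of factor indices, are ours), a directed 4-cycle with the single factor F_1 and the words ε, F_1, F_1F_1, F_1F_1F_1 forms a spanning factorization:

```python
def endpoint(v, word, factors):
    """Follow the path v*omega: one edge of the indicated factor per symbol."""
    for i in word:
        v = factors[i][v]
    return v

def is_spanning_factorization(factors, words, n):
    """From every start vertex v, the n words must end at all n vertices."""
    return all(sorted(endpoint(v, w, factors) for w in words) == list(range(n))
               for v in range(n))

F1 = {v: (v + 1) % 4 for v in range(4)}        # directed 4-cycle
words = [(), (0,), (0, 0), (0, 0, 0)]          # empty word plus three powers of F1
ok = is_spanning_factorization([F1], words, 4)  # True
```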

Schedules.
A schedule for universal exchange associated with a factorization is an assignment of a time (a label) to each occurrence of each factor in the words of W such that no time is assigned more than once to a particular factor and the times assigned to the factors in a single word are increasing. The time of a schedule is the largest time assigned to any of the factors. If T is the total time, the schedule can be thought of as a d × T array where each row corresponds to a factor and an entry in that row indicates which occurrence of that factor has been assigned the corresponding time. An entry in a row of the array can be empty, indicating that no occurrence of that factor has been assigned the given time. The power of a spanning factorization lies in the fact that a schedule can be used to describe an algorithm for conflict-free global exchange of information between the vertices of the graph.
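A simple way to build (not necessarily minimum) schedules is greedy assignment: walk each word left to right and give each factor occurrence the smallest time that is larger than the previous time in the word and still free for that factor. A sketch, with illustrative names of our own:

```python
def greedy_schedule(words, d):
    """Assign times to factor occurrences so that times within a word
    increase and no factor receives the same time twice."""
    used = [set() for _ in range(d)]   # times already taken, per factor
    sched = {}                         # (word index, position) -> time
    for wi, w in enumerate(words):
        t = 0
        for pos, f in enumerate(w):
            t += 1
            while t in used[f]:        # next free slot for this factor
                t += 1
            used[f].add(t)
            sched[(wi, pos)] = t
    return sched, max((t for s in used for t in s), default=0)

# words over a single factor F_1 (index 0), as for a directed 4-cycle
sched, T = greedy_schedule([(), (0,), (0, 0), (0, 0, 0)], 1)
```

The single factor occurs six times in these words, so no schedule can finish before time 6; greedy attains that here.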

BMSA Volume 20
Proof. Each edge in the graph is assigned to a single 1-factor. Assume there is an edge in the 1-factor F that has been assigned the same time twice. Since every occurrence of F in the words in W has been assigned a unique time, this can only mean that there are two different vertices u and v and an initial subword ω of a word in W such that the edges (uω, uωF) and (vω, vωF) are the same edge. Then uω and vω must be the same vertex. Let us assume that ω is the shortest word for which this happens. The word ω cannot be empty since u and v are different. But then the edges given by the last factor in ω must also be the same edge, a contradiction. If we start with a spanning factorization, then all the non-empty paths from v are unique, there are n − 1 of them and none of them can return to v, so they must reach every other vertex in the graph.
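The conflict-freeness in this argument can also be checked directly: transfer the single schedule to every start vertex and verify that no directed edge is used at the same time by two different paths. A small self-contained sketch (all names ours), reusing the 4-cycle spanning factorization:

```python
def conflict_free(factors, words, sched, n):
    """Run every word from every start vertex; fail if some directed edge
    (tail, head) is used at the same time by two different paths."""
    used = set()
    for v in range(n):
        for wi, w in enumerate(words):
            u = v
            for pos, f in enumerate(w):
                key = (u, factors[f][u], sched[(wi, pos)])  # edge + time
                if key in used:
                    return False
                used.add(key)
                u = factors[f][u]
    return True

F1 = {v: (v + 1) % 4 for v in range(4)}
words = [(), (0,), (0, 0), (0, 0, 0)]
# times: the k-th occurrence of F1 (reading the words in order) gets time k
sched = {(1, 0): 1, (2, 0): 2, (2, 1): 3, (3, 0): 4, (3, 1): 5, (3, 2): 6}
```

As the proof predicts, this schedule is conflict-free, while reusing a time on the same factor (say, giving occurrence (2, 0) time 1 as well) immediately creates a collision.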
There are some additional properties that a spanning factorization with word list W might have.
Definition. We say a spanning factorization is balanced if each factor appears nearly as often in the schedule as any other. We say the factorization is short if the average number of times a factor appears is the same as the theoretical lower bound θ based on the average distance between any two vertices and the number of edges. We say the factorization is optimal if it is short and balanced.
A schedule Σ is minimum for a spanning factorization if it has time τ(Σ) equal to the theoretical minimum time for the factorization, namely the maximum number of times any single factor appears in the words of W. In mathematical terms, we can write θ = (1/m) Σ_{k=1}^{D} k N_k, where N_k is the number of times the distance between two vertices is k, m is the number of edges and D is the diameter. Note that these parameters are ordered: θ is at most the maximum factor usage, which is at most τ(Σ) for any schedule Σ. The creation of schedules for spanning factorizations is discussed in [11], where the following is proven.
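Under this reading of θ (notation ours: N_k counted over ordered pairs, m the number of edges), the bound is easy to compute by breadth-first search. A sketch:

```python
from collections import deque

def theta(adj, m):
    """Lower bound ceil((1/m) * sum of dist(u, v) over ordered pairs):
    a conflict-free exchange uses each of the m edges at most once per
    time step, so the total traversals bound the time."""
    total = 0
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    q.append(w)
        total += sum(dist.values())
    return -(-total // m)  # ceiling division

# directed 4-cycle: distances 1 + 2 + 3 from each vertex, m = 4 edges
adj = {v: [(v + 1) % 4] for v in range(4)}
```

For the directed 4-cycle this gives θ = 6.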

Fact 2.
Every diameter-two spanning factorization has a minimum schedule unless the maximum belongs to a factor F_i which is not in a word of length one and, in the words of length two, is entirely absent from one position, either first or second. In that case, the shortest time for any schedule is one more than the theoretical minimum.
Universal broadcast. This paper concerns universal exchange (transpose). Employing these ideas for universal broadcast requires more restrictions on the list of words. In a universal broadcast, instead of sending a different piece of information to every other vertex, a vertex has one single piece of information to send to all others. To utilize this communication pattern, we impose an additional condition on the words in the list W: we say that the list W is hierarchical if every initial subword of a word in W is also a word in W. Given a hierarchical list for universal broadcast, the list can be thought of as a tree, and each edge in the tree is labeled with a time only once. The problem of assigning an optimal schedule is greatly simplified because all that is needed is for the times on a factor to form a partial order. It may still be a difficult problem to find the best tree to use.
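The hierarchical condition is a one-line check over the word list. A sketch with our own representation (words as tuples of factor indices):

```python
def is_hierarchical(words):
    """Every initial subword (including the empty word) of a word in W
    must itself be a word in W, so W forms a tree rooted at the empty word."""
    ws = set(words)
    return all(w[:k] in ws for w in words for k in range(len(w) + 1))

tree_like = [(), (0,), (1,), (0, 1), (0, 1, 1)]
not_tree = [(), (0, 1)]  # missing the prefix (0,)
```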

Cayley coset graphs.
Our main goal is to find spanning factorizations for Cayley coset graphs. If the graph is a Cayley graph, this is easy.

Theorem 2. Every Cayley graph has a short factorization.
Proof. This is a sketch. Take a tree T_1 of shortest paths from the identity of the group. For each generator, the edges labeled with that generator form a factor. The words are just the paths in T_1; since every word is a shortest path, the factorization is short.

Question. Does every Cayley coset graph have a spanning factorization or even a short spanning factorization?
Example 1 - CP graphs. CP graphs G(d, D) are vertex-symmetric digraphs with a large number of vertices for a given degree d and diameter D. They were first introduced in [6] and [7]. Many of the properties that make them desirable for multiprocessor networks have been studied in [1], [3] and [4]. In particular, [4] constructs broadcast trees for CP graphs which are related to our factorization. In [14], it is determined which CP graphs are Cayley graphs, and thus these have a short factorization. We can show that all CP graphs have a short factorization. To describe the edges, we use the notation in [2]. Let x be an arbitrary vertex of G. We know that the outward neighbors of x are of two types [7]. Consider the vertices given by xw, w ∈ W, and let α be the permutation defined in [7].

Proof. In [7], the number of vertices is given as (d + 1)d(d − 1)⋯(d − D + 2). For example, G(2, 2) has 6 vertices, degree 2, and the sum of the distances from one vertex to all of the others is 8, so the theoretical minimum time is 4, not 5. The theoretical minimum cannot be achieved by using the unique shortest paths, since they have 30 F_2 edges and at most 6 can be used at any one time. Replacing even one shortest path by a longer path will increase the lower bound on the time. Now we can produce a minimum schedule for universal exchange using the factorization in Theorem 3 for D > 3. We start by looking at the usage of each F_j in the factorization. To this end, we define a recursion that grows the tree of unique shortest paths from any vertex v.
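To make the example checkable, here is one construction of the cycle prefix digraphs as we read [7] (the neighbor rule below, prefixing a new symbol or rotating an interior symbol to the front, is our reconstruction and should be checked against [7]; the names are ours). It reproduces the numbers used above for G(2, 2): 6 vertices, degree 2 and distance sum 8, hence θ = 4:

```python
from itertools import permutations
from collections import deque

def cp_graph(d, D):
    """Vertices: D-sequences of distinct symbols from a (d+1)-letter alphabet.
    Out-neighbors of x = x1...xD (two types, d in total):
      (i)  y x1 ... x_{D-1} for each symbol y not in x   (d + 1 - D of these)
      (ii) x_k x1 ... x_{k-1} x_{k+1} ... xD, k = 2..D   (D - 1 of these)."""
    alphabet = range(d + 1)
    adj = {}
    for x in permutations(alphabet, D):
        nbrs = [(y,) + x[:-1] for y in alphabet if y not in x]
        nbrs += [(x[k],) + x[:k] + x[k + 1:] for k in range(1, D)]
        adj[x] = nbrs
    return adj

def distance_sum(adj, s):
    """Sum of shortest-path distances from s, by breadth-first search."""
    dist = {s: 0}
    q = deque([s])
    while q:
        u = q.popleft()
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                q.append(w)
    return sum(dist.values())

G = cp_graph(2, 2)
```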

Proof. This is just another expression of the algorithm in Theorem 3.5 in [7].
We often refer to the first entry in (c, t) as the c label and the second as the t label. The t label denotes the distance from the vertex v, while the c label can be thought of as keeping track of the number of remaining uses we are allowed for factors with small indices. Intuitively, these factors are a limited resource because they consist of short cycles, and reusing them too often gives a path that does not increase the distance from v.
Proof. This is clearly true in the base case. Moving out along the edges, the c labels either remain the same or decrease by 1, and there is always at least one case where the c label decreases. Suppose we consider a vertex labeled (c, t) and an edge belonging to the cycle F_j going out of it.

Question. How many times does F_j appear in paths of length k for k ≤ D?
Let S_k(j, i) be the number of paths of length k using F_j as the i-th step. Once we have T, we show a recursion in Lemma 13 that can be used to calculate S. We calculate T(c, t) recursively.
We are not allowed to reduce c, so we just have the terms in j given by the induction hypothesis, and this is exactly the required expression.

Proof. This is just a restatement of Lemma 6 and the fact that, by definition, the tree starting at a vertex v labeled (c, k) has v as its only leaf.

Lemma 9. For
Proof. We prove this by induction, starting with t = k and working backward.
and thus this is a valid statement. Now we assume that t < k and that both statements are valid for t + 1. We break the proof that the statements are valid for t up into cases.
First, suppose that c = 1. Then we have to verify statement (ii). The needed quantities are known from case (i) of the induction hypothesis. Using Lemma 9, we can compute the required value, which matches statement (ii). This concludes the induction step and proves the Lemma.

Next we calculate two other auxiliary quantities.

Proof. By the initialization step in Theorem 5, the initial values hold. The recursion part of Theorem 5 then yields the claimed formula; furthermore, this recursion and the initial condition are satisfied by the closed form.

Proof. The presence of the factor F_j depends only on the label.

Proof. As mentioned above Lemma 7, the number of paths of length k using F_j as the t-th step, S_k(j, t), can be described (using the convention that the falling factorial is zero whenever the argument or the index is out of bounds) as follows.

Proof. First, we expand the statement by eliminating the conventions. The equivalent statement has these cases:

Now we can prove each of the cases. We use the well-known combinatorial identity and its falling factorial analog; evaluating the summation on the far right then proves statement 2d.

Lemma 15. At every level, cycles with larger indices appear more often than cycles with smaller indices.

Proof. For t = 1, this is clearly true by Theorem 14 part 1. If t > 1, we notice that Lemma 13 gives the difference, and as in Theorem 14 there are cases to consider from Lemma 10; in each case the difference is clearly non-negative. It follows that no time appears more than once on a given factor and that the times are partially ordered by layer.

Discussion
This expression can be manipulated into the desired form, and by the results in [7] we know the structure of the graph. To show that this collection forms a spanning factorization, we need only show that all the vω_i with ω_i ∈ W are different. To start, it is clear that words of the first and second type produce vertices which are distinct from the vertices produced by the third type. Words of the first and second type also produce different vertices from each other, because the first type fixes the first coordinate while the second type does not. Some simple algebra shows that the members of each type produce vertices which are distinct from those produced by other members of the same type. Finally, we have to show that no word with a single factor produces a vertex identical to one produced by a word with two factors. This is clearly true when comparing words of different parities, cross-over and fix-r. Note that any single fix-r factor preserves the first coordinate, so it cannot match the result of two cross-over factors.

In order to find the time for the minimum schedule, we have to calculate the number of times each factor appears. A fix-r factor appears on the left in a word once with another fix-r factor, q times with a cross-over factor, and once by itself. A fix-r factor appears on the right once with another fix-r factor and q times with a cross-over factor, for a total of 2q + 3 times. A cross-over factor appears on the left in a word q − 1 times with another cross-over factor. It is easy to see that the subgroup H has order q(q − 1), so the coset graph has the right number of vertices; this is easy to verify by direct computation using the definitions of c, y and h.