# Structure of Leavitt path algebras of polynomial growth


Contributed by Efim I. Zelmanov, June 12, 2013 (sent for review May 26, 2013)

## Abstract

We determine the structure of Leavitt path algebras of polynomial growth and discuss their automorphisms and involutions.

Following the works in refs. 1–4, Abrams and Aranda Pino (5) and Ara et al. (6) introduced Leavitt path algebras of directed graphs as algebraic analogs of the C*-algebras of Cuntz and Krieger. This construction provided a rich supply of finitely presented algebras having interesting and extreme properties.

Let Γ be a finite directed graph with the set of vertices *V* and the set of edges *E*. For an edge e ∈ E we let s(e) and r(e) denote its source and range, respectively. A vertex *v* for which s^{−1}(v) is empty is called a *sink*. A *path* p = e_{1}⋯e_{n} in a graph is a sequence of edges e_{1}, …, e_{n} such that r(e_{i}) = s(e_{i+1}) for 1 ≤ i ≤ n − 1. In this case we say that the path *p* starts at the vertex s(e_{1}) and ends at the vertex r(e_{n}). If s(e_{1}) = r(e_{n}), then the path is closed. If p = e_{1}⋯e_{n} is a closed path and the vertices s(e_{1}), …, s(e_{n}) are distinct, then the subgraph ({s(e_{1}), …, s(e_{n})}, {e_{1}, …, e_{n}}) of the graph Γ is called a cycle.

Let Γ be a finite graph and let *F* be a field. The Leavitt path *F*-algebra L(Γ) is the *F*-algebra presented by the set of generators V ∪ E ∪ E*, where E* = {e* | e ∈ E}, and the set of relations (*i*) vw = δ_{v,w}v for all v, w ∈ V; (*ii*) s(e)e = e r(e) = e and r(e)e* = e*s(e) = e* for all e ∈ E; (*iii*) e*f = δ_{e,f} r(e) for all e, f ∈ E; and (*iv*) v = Σ_{s(e)=v} ee* for an arbitrary vertex *v* that is not a sink. The mapping that sends *v* to *v*, *e* to e*, and e* to *e*, for v ∈ V, e ∈ E, extends to an involution * of the algebra L(Γ). If p = e_{1}⋯e_{n} is a path, then p* = e_{n}*⋯e_{1}*.

In ref. 7 we showed that the algebra L(Γ) has polynomial growth if and only if no two distinct cycles of Γ intersect (i.e., share a vertex). Let Γ be such a graph, and let L = L(Γ). For an algebra *R*, let M_{n}(R) denote the algebra of n × n matrices over *R* and let M_{∞}(R) denote the algebra of infinite finitary matrices over *R*, that is, infinite matrices with only finitely many nonzero entries.
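The growth criterion from ref. 7 is effectively checkable on a finite graph: two distinct cycles share a vertex exactly when some vertex has two outgoing edges that stay inside its own strongly connected component. The sketch below (ours, not from the paper; the graph encoding and function names are illustrative) tests the condition this way.

```python
def sccs(vertices, edges):
    # Kosaraju's algorithm: postorder DFS on the graph, then DFS on the
    # reversed graph in reverse finishing order.
    adj = {v: [] for v in vertices}
    radj = {v: [] for v in vertices}
    for s, r in edges:
        adj[s].append(r)
        radj[r].append(s)
    order, seen = [], set()

    def dfs1(v):
        seen.add(v)
        for w in adj[v]:
            if w not in seen:
                dfs1(w)
        order.append(v)

    for v in vertices:
        if v not in seen:
            dfs1(v)
    comp = {}

    def dfs2(v, c):
        comp[v] = c
        for w in radj[v]:
            if w not in comp:
                dfs2(w, c)

    for v in reversed(order):
        if v not in comp:
            dfs2(v, v)
    return comp

def has_polynomial_growth(vertices, edges):
    """True iff no two distinct cycles of the graph share a vertex:
    equivalently, no vertex has two outgoing edges (counted with
    multiplicity) staying inside its strongly connected component."""
    comp = sccs(vertices, edges)
    internal = {}
    for s, r in edges:
        if comp[s] == comp[r]:
            internal[s] = internal.get(s, 0) + 1
            if internal[s] > 1:
                return False
    return True
```

For instance, the Toeplitz graph (a loop at *v* plus an exit edge to *w*) passes the test, while a vertex carrying two loops, or two cycles through a common vertex, fails it.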

**Theorem 1.** *Let* L = L(Γ) *be a Leavitt path algebra of polynomial growth. Then* L *has a finite chain of ideals* 0 = I_{0} ⊆ I_{1} ⊆ ⋯ ⊆ I_{s} = L *such that* I_{1} *is a finite sum of matrix algebras and infinite finitary matrix algebras over F and each factor* I_{i+1}/I_{i}, 1 ≤ i ≤ s − 1, *is a finite sum of matrix algebras and finitary matrix algebras over the Laurent polynomial algebra* F[t, t^{−1}]. *The ideals* I_{i} *are invariant under* Aut L.

**Remark 1:** We will show that I_{1} is the locally finite radical of L (8).

In the rest of the paper we study the algebraic Toeplitz algebra T (9) as the simplest nontrivial example of a Leavitt path algebra of polynomial growth. As shown in ref. 9 (it also follows from *Theorem 1* above), the locally finite radical of T is M_{∞}(F) and T/M_{∞}(F) ≅ F[t, t^{−1}].

**Theorem 2.** *The short exact sequence* 0 → M_{∞}(F) → T → F[t, t^{−1}] → 0 *does not split.*

The significance of *Theorem 2* is that it shows that the extensions in *Theorem 1*, generally speaking, do not split.

We describe the automorphisms and involutions of the algebraic Toeplitz algebra T. The description of involutions is related to the question of whether isomorphic Leavitt path algebras are isomorphic as involutive algebras (10).

**Theorem 3.** Aut(T) ≅ F* ⋉ GL_{∞}(F), *a semidirect product of the multiplicative group* F* *of the field F with the general linear finitary group* GL_{∞}(F). *If every element of F is a square, then the only involution on* T *(up to isomorphism) is the standard involution* *.

In what follows we will assume that the finite graph Γ does not have distinct intersecting cycles, which guarantees that L(Γ) has polynomial growth. For an arbitrary path *p*, the element pp* is an idempotent. Consider the family of idempotents ℰ = {pp* | p is a path in Γ}.

**Remark 2**: We view vertices as paths of length 0.

For two idempotents e = pp*, f = qq* ∈ ℰ, if neither *p* nor *q* is an initial subpath of the other, then *e* and *f* are orthogonal. If q = pp′ for some path p′, then ef = fe = f.

Consider the set of vertices W = {v ∈ V | no path that starts at v ends on a cycle}. The subset W ⊆ V is hereditary and saturated (5). Hence, the ideal I(W) generated by W is the *F*-span of all products pq*, where p, q are paths, r(p) = r(q) ∈ W. Let I_{1} = I(W). Because no vertex of W connects to a cycle, it follows that I_{1} is locally finite dimensional. Consider also the set of idempotents ℰ_{min} = {pp* ∈ ℰ | r(p) is a sink}. We call idempotents from ℰ_{min} minimal. Let w_{1}, …, w_{t} be all sinks of Γ. Let ℰ_{i} = {pp* ∈ ℰ | r(p) = w_{i}}. Clearly, ℰ_{min} = ℰ_{1} ∪ ⋯ ∪ ℰ_{t} and ℰ_{i} ∩ ℰ_{j} = ∅ if i ≠ j.

**Lemma 4** (6).

*i*) *Every idempotent from* ℰ ∩ I_{1} *is a sum of minimal idempotents*; *ii*) *if* e, f ∈ ℰ_{min} *and* e ≠ f, *then* ef = fe = 0; *iii*) *if* p, q, u, v *are paths ending at sinks, then* (pq*)(uv*) = δ_{q,u} pv*, *that is, the products* pq* *multiply as matrix units*.

The set ℰ_{i} is infinite if and only if there exists a cycle from which one can get to w_{i}. In that case *Lemma 4* implies that span(pq* | r(p) = r(q) = w_{i}) ≅ M_{∞}(F). Otherwise span(pq* | r(p) = r(q) = w_{i}) ≅ M_{k}(F), where *k* is the number of paths that end at w_{i}. We thus have proved that I_{1} is isomorphic to a finite sum of matrix algebras and infinite finitary matrix algebras over *F*.

Recall that an algebra is said to be *locally finite dimensional* if every finitely generated subalgebra of it is finite dimensional. The sum of all locally finite-dimensional ideals of an associative algebra *A* is again a locally finite-dimensional ideal, called the *locally finite-dimensional radical* and denoted here by LF(A). For further properties of LF(A), see ref. 8.

**Lemma 5.** I_{1} = LF(L(Γ)).

The ideal I_{1} is also the socle of the algebra L(Γ) (11).

As shown in ref. 5, L(Γ)/I_{1} ≅ L(Γ′), where Γ′ = Γ ∖ W; the graph Γ′ does not have sinks. Without loss of generality we therefore consider a finite graph Γ such that W = ∅ and Γ does not have sinks, so I_{1} = (0).

Recall that an edge *e* is called an exit from a cycle *C* if s(e) lies on *C*, but *e* is not a part of *C* (5). A cycle without exits will be referred to as an *NE* cycle. For an arbitrary vertex v ∈ V there exists a path that starts at *v* and ends on a cycle; otherwise v ∈ W, which contradicts our assumption. Moreover, because distinct cycles of Γ do not intersect and all chains of cycles (7) are finite, it follows that for an arbitrary v ∈ V there exists a path that starts at *v* and ends on an *NE* cycle.

Consider the set U = {v ∈ V | every cycle that can be reached from v is an NE cycle}. The set U is obviously hereditary and saturated. Let C_{1}, …, C_{s} be all *NE* cycles of Γ, s ≥ 1. Clearly, the vertices of each C_{i} lie in U and C_{i} ∩ C_{j} = ∅ if i ≠ j. We define J = I(U).

Consider an *NE* cycle C_{i} with n_{i} vertices. In ref. 12 it is shown that the subalgebra L(C_{i}) = span(pq* | *p*, *q* are both paths on the cycle C_{i}) is isomorphic to M_{n_{i}}(F[t, t^{−1}]).
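For a single NE cycle the isomorphism can be written down on generators; the following display is our sketch of the standard map (the matrix-unit notation E_{ij} is ours), and one checks that it respects the defining relations of the Leavitt path algebra.

```latex
% Sketch (our notation): C an NE cycle with vertices v_1,\dots,v_n,
% edges e_i : v_i \to v_{i+1} (1 \le i \le n-1) and e_n : v_n \to v_1.
% Define \varphi : L(C) \to M_n(F[t,t^{-1}]) on generators by
\varphi(v_i) = E_{ii}, \qquad
\varphi(e_i) = E_{i,i+1}, \quad \varphi(e_i^{*}) = E_{i+1,i} \quad (1 \le i \le n-1),
\qquad \varphi(e_n) = t\,E_{n,1}, \quad \varphi(e_n^{*}) = t^{-1}E_{1,n}.
% Since C has no exits, relation (iv) reads v_i = e_i e_i^{*}, and indeed
% \varphi(e_i)\varphi(e_i^{*}) = E_{ii} and \varphi(e_i^{*})\varphi(e_i) = E_{i+1,i+1},
% so all defining relations are preserved; the closed path c = e_1\cdots e_n
% based at v_1 is sent to t\,E_{11}.
```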

**Lemma 6.** *Let* C_{i} *be an NE cycle and let* v *be a fixed vertex of* C_{i}. *Then* span(pq* | p, q are paths that end on C_{i}) ≅ M_{Λ_{i}}(F[t, t^{−1}]), *where* Λ_{i} *is the set of paths that end at* v *and do not contain* C_{i}.

If the set Λ_{i} is infinite, which happens if there exists a cycle *C* different from C_{i} and a path *p* such that *p* starts on *C* and ends on C_{i}, then by *Lemma 6* span(pq* | p, q end on C_{i}) ≅ M_{∞}(F[t, t^{−1}]). If |Λ_{i}| = k < ∞, then it is isomorphic to M_{k}(F[t, t^{−1}]).

We have proved that the algebra *J* is isomorphic to a finite direct sum of matrix algebras and infinite finitary matrix algebras over F[t, t^{−1}]. The ideal I_{1} of the algebra L(Γ) has been defined. For i ≥ 1 define I_{i+1} via I_{i+1}/I_{i} = J(L/I_{i}), the ideal constructed above for the Leavitt path algebra L/I_{i}. We have got the ascending chain claimed in *Theorem 1*. The ideal I_{1} is invariant under Aut L(Γ). To prove that the ideals I_{i}, i ≥ 2, are invariant we need to obtain an abstract characterization of the ideal *J*.

**Lemma 7.** *The ideal J is the largest ideal of* L = L(Γ) *with the property that for an arbitrary element* a ∈ J *and an arbitrary finite-dimensional subspace G of* L *that generates* L, *there exists a positive constant* γ *such that* dim G^{n}aG^{n} ≤ γn *for* n ≥ 1.

**Corollary 8.** *Let* Γ_{1} *and* Γ_{2} *be finite graphs, and suppose that* φ : L(Γ_{1}) → L(Γ_{2}) *is an isomorphism and that* L(Γ_{1}) *has polynomial growth. Then* φ(J(L(Γ_{1}))) = J(L(Γ_{2})).

*Theorem 1* is proved.

We determined the factors I_{i+1}/I_{i}, but the nature of the extensions remains unclear. *Theorem 2* implies that, generally speaking, they do not split.

The algebra T can be presented by generators and relators as T = F⟨x, y | xy = 1⟩ (13). Let us fix the notation A = F⟨x, y | xy = 1⟩. The element 1 − yx is an idempotent. We have e_{ij} = y^{i}(1 − yx)x^{j}, e_{ij}e_{kl} = δ_{j,k}e_{il}, so that span(e_{ij} | i, j ≥ 0) ≅ M_{∞}(F) is an ideal of *A* with A/M_{∞}(F) ≅ F[t, t^{−1}].
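A concrete model of the relation xy = 1 (with yx ≠ 1) is given by shift operators on a vector space with basis v_0, v_1, …. The sketch below (ours; the helper names are illustrative) checks that 1 − yx acts as the rank-one projection onto Fv_0 and that the standard elements y^i(1 − yx)x^j act as matrix units.

```python
# Vectors with finite support are dicts {index: coefficient}.

def e(n):  # basis vector v_n
    return {n: 1}

def y(vec):  # right shift: v_n -> v_{n+1}
    return {n + 1: c for n, c in vec.items()}

def x(vec):  # left shift: v_n -> v_{n-1}, and v_0 -> 0
    return {n - 1: c for n, c in vec.items() if n >= 1}

def sub(u, v):  # u - v, dropping zero coefficients
    out = dict(u)
    for n, c in v.items():
        out[n] = out.get(n, 0) - c
        if out[n] == 0:
            del out[n]
    return out

def p(vec):  # action of the idempotent 1 - yx
    return sub(vec, y(x(vec)))

def E(i, j, vec):  # action of y^i (1 - yx) x^j: sends v_j to v_i, kills v_n for n != j
    for _ in range(j):
        vec = x(vec)
    vec = p(vec)
    for _ in range(i):
        vec = y(vec)
    return vec

# xy = 1 holds on every basis vector, but 1 - yx is the projection onto F v_0:
assert all(x(y(e(n))) == e(n) for n in range(10))
assert p(e(0)) == e(0) and all(p(e(n)) == {} for n in range(1, 10))
```

This is exactly the picture in which M_∞(F) appears inside the Toeplitz algebra as the span of the operators E(i, j, ·).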

Suppose that the extension splits, that is, the algebra *A* contains a subalgebra *B* which is isomorphic to F[t, t^{−1}] and A = B ⊕ M_{∞}(F). Let , where , ; , . Consider the finite sets , and one-sided ideal , .

**Lemma 9.** *For an arbitrary element* *there exists* *such that* , , *for any* .

*Proof*: Choose an element . We define . Because for , , it follows that or . This implies the first inclusion. The second inclusion is proved in the same way. This completes the proof of the Lemma. ∎

**Lemma 10.** *For an arbitrary element* *we have* .

*Proof*: Let . For arbitrary integers we have . Notice that . Fix . It follows from the above that there exists a nonzero polynomial such that Because every nonzero ideal of the algebra is of finite codimension, we conclude that . The element is the identity of the algebra *B*. Notice that . Otherwise *IB* is a nilpotent left ideal of the algebra *I*, which implies that , is a direct sum. However, the algebra is not finitely generated, a contradiction. In view of the simplicity of the algebra *I*, the subset generates *I* as an ideal. This completes the proof of the Lemma.

**Lemma 11.** *For an arbitrary element* *we have* .

*Proof*: Let denote the sum and let denote the sum and . Let and let . We claim that for each element and for an arbitrary product *b* of elements , of length we have , where are products of elements , of length . Indeed, consider the ascending chain of subspaces , where is the *F*-span of all products of elements , of length . Because we cannot have a strict inclusion at every step. Hence , as claimed. Every product of elements *y* is a linear combination of products of , , , . Let *w* be a product of elements , , , . Then *aw* can be represented as , where are products of *a*, , are products of , . Because of the presence of the element *a* at the left end the word is not empty. The claim above implies that the words can be assumed to have lengths . Now *aw* lies in the subalgebra of generated by , , where elements *b* are products in , of lengths . This subalgebra is finitely generated, hence finite dimensional. This completes the proof of the Lemma. ∎

It is well known that the set {y^{i}x^{j} | i, j ≥ 0} is a basis of *A*. Hence the elements above are linearly independent, a contradiction. *Theorem 2* is proved.

Now our aim is to describe the automorphisms and involutions of the algebra A = F⟨x, y | xy = 1⟩.

Consider a countably infinite-dimensional vector space *V* with a fixed basis v_{1}, v_{2}, …. Let *E* be the algebra of all linear transformations of *V*. Because the basis has been fixed, we can identify *E* with the algebra of matrices having only finitely many nonzero entries in each column. Consider also the subalgebra *B* of *E* which consists of matrices having finitely many nonzero entries in each row and in each column. As above, M_{∞}(F) is the algebra of finitary (having only finitely many nonzero entries) matrices. It is easy to see that M_{∞}(F) is an ideal in *B* and a left ideal in *E*.
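The one-sided ideal claim can be seen concretely: multiplying a finitary matrix on the left by a column-finite matrix touches only finitely many (finite) columns, so the product is again finitary, while multiplying on the right can smear a finite matrix along an infinite row. A small sketch (ours; the encodings and names are illustrative):

```python
# Finitary matrices: dicts {(i, j): coeff}.  A column-finite matrix is given
# by a function col(k) returning its k-th column as a finite dict {i: coeff}.

def left_mul(col, A):
    """(E A)[i, j] = sum_k E[i, k] A[k, j]; finite because A is finitary."""
    out = {}
    for (k, j), a in A.items():
        for i, c in col(k).items():
            out[(i, j)] = out.get((i, j), 0) + c * a
    return {key: v for key, v in out.items() if v != 0}

def right_mul_entry(A, col, i, j):
    """Single entry (A E)[i, j]; the full product need not be finitary."""
    return sum(a * col(j).get(k, 0) for (r, k), a in A.items() if r == i)

# A column-finite but not row-finite matrix: E[i, k] = 1 for i <= k,
# so every column is finite while every row is infinite.
upper = lambda k: {i: 1 for i in range(k + 1)}

A = {(2, 0): 1}            # the finitary matrix unit e_{20}
EA = left_mul(upper, A)    # finitary: column 2 of E, placed in column 0
# On the other side, (A E)[2, j] = E[0, j] = 1 for every j, so A E has an
# infinite row: the finitary matrices are not a right ideal of E.
```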

As follows from *Theorem 1*, the ideal LF(A) is isomorphic to M_{∞}(F). Extending this isomorphism we can embed *A* into the algebra *B*: the cycle *c* and its conjugate c* are identified with the matrices Σ_{i≥1} e_{i+1,i} and Σ_{i≥1} e_{i,i+1}, respectively, where the e_{ij} are matrix units, i, j ≥ 1.

**Theorem 12.** *(Jacobson, ref.* 14*). For an arbitrary automorphism φ of* M_{∞}(F) *there exists an invertible element* T ∈ E *such that* φ(a) = TaT^{−1} *for any* a ∈ M_{∞}(F).

**Lemma 13.** *An automorphism of* A *induces an automorphism of* A/M_{∞}(F) ≅ F[t, t^{−1}] *of the type* t → αt, α ∈ F*.

*Proof*: If the assertion is not true, then there exists an automorphism φ of *A* whose image in Aut F[t, t^{−1}] maps *t* to αt^{−1}, α ∈ F*. By Jacobson’s theorem there exists an invertible element T ∈ E such that φ(a) = TaT^{−1} for all a ∈ M_{∞}(F). In particular, , . Hence, , . This implies that for a sufficiently large we have provided that . Therefore, . We showed that for . The *j*th column of the matrix *T* intersects all diagonals , . Hence if the sequence , , contained infinitely many nonzero entries, then every column of *T* would contain infinitely many nonzero entries, which is impossible because *T* ∈ E is column finite. Hence the matrix *T* is finitary, which contradicts its invertibility. This completes the proof of the Lemma. ∎

**Lemma 14.** *If* *is invertible and* *then* .

Recall that the group of invertible matrices from 1 + M_{∞}(F) is called the finitary general linear group GL_{∞}(F) (15). It can be realized as the union GL_{∞}(F) = ∪_{n≥1} GL_{n}(F), where GL_{n}(F) is embedded into GL_{n+1}(F) via g → diag(g, 1).
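One concrete reading of this union (our sketch; the names are illustrative): g ∈ GL_n(F) is identified with diag(g, 1, 1, …), whose finitary part g − 1_n sits in the top-left corner, and these identifications are compatible with multiplication.

```python
def embed(g, N):
    """Identify g in GL_n(F) with diag(g, 1, ..., 1) truncated to size N:
    the finitary part g - 1_n sits in the top-left n x n corner."""
    n = len(g)
    return [[g[i][j] if i < n and j < n else (1 if i == j else 0)
             for j in range(N)] for i in range(N)]

def matmul(a, b):
    N = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(N)) for j in range(N)]
            for i in range(N)]

g = [[1, 1], [0, 1]]   # an element of GL_2(F)
h = [[0, 1], [1, 0]]   # another one
# The embedding is multiplicative, so the union of the GL_n(F) under these
# identifications is a group:
assert matmul(embed(g, 5), embed(h, 5)) == embed(matmul(g, h), 5)
```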

**Lemma 15.** *Let* , , , *for any* . *Then* , , .

*Proof:* By our assumptions , , or, equivalently, , . Hence for a sufficiently large , provided that (we assume that ). Hence *T* is an almost Toeplitz matrix, , where , , . The *j*th column intersects all diagonals with . Hence the set is finite. Similarly, an *i*th row intersects all diagonals with . Hence the set is finite as well. Now we have ; . Because the matrix cannot be strictly upper or lower triangular (otherwise *T* would not be invertible), we can assume that *m*, . All of the above applies to the matrix as well, ; ; *p*, , is a finitary matrix. Now

Because *T*, it follows that . Moreover, the equality above implies that , . This completes the proof of the Lemma, and thus completes the proof of *Theorem 3*. ∎

Consider the embedding of the multiplicative group F* of the field *F* into the multiplicative group of the algebra *B*, α → d(α) = diag(1, α, α^{2}, …). It is easy to see that d(α)cd(α)^{−1} = αc and d(F*) ∩ GL_{∞}(F) = (1). Now, *Lemmas 13* and *15* imply that Aut(A) ≅ F* ⋉ GL_{∞}(F).

We say that two involutive algebras (A_{1}, *) and (A_{2}, #) are isomorphic if there exists an isomorphism of algebras φ : A_{1} → A_{2} such that φ(a*) = φ(a)^{#} for an arbitrary element a ∈ A_{1}.

**Lemma 16.** *Let every element of the field F be a square. Then the algebra* M_{∞}(F) *has only one (up to isomorphism) involution: the standard involution* a → a^{t}*.*

*Proof*: If we view M_{∞}(F) as a subalgebra of the algebra *B*, then the standard involution becomes the restriction of the transposition a → a^{t}. Let a → ā be an involution. The composition of the involutions − and *t* is an automorphism. Hence there exists a matrix *T* such that ā = Ta^{t}T^{−1} for all elements a ∈ M_{∞}(F). Applying the involution − twice we get a = T(T^{t})^{−1}aT^{t}T^{−1}. Because the matrix T^{t}T^{−1} commutes with an arbitrary matrix from M_{∞}(F), it follows that T^{t} = γT, γ ∈ F*. Now, T = (T^{t})^{t} = γ^{2}T, γ = ±1. All nonzero entries of the matrix *T* except finitely many lie on the main diagonal. Hence *T* cannot be skew-symmetric. Hence T^{t} = T. If an arbitrary element from *F* is a square, then there exists a matrix *S* such that T = SS^{t}. Now the mapping a → S^{−1}aS is an isomorphism of the involutive algebra (M_{∞}(F), −) to the involutive algebra (M_{∞}(F), t). This completes the proof of the Lemma. ∎

### Note Added in Proof.

For a different approach to automorphisms of the Jacobson algebra, see ref. 16.

## Acknowledgments

The authors thank G. Abrams and J. Bell for numerous helpful remarks. This research is supported by the Deanship of Scientific Research, King Abdulaziz University. The work is also partially supported by the National Science Foundation.

## Footnotes

- ^{1}To whom correspondence should be addressed. E-mail: ezelmano@math.ucsd.edu.

Author contributions: A.A., H.A., S.J., and E.I.Z. designed research, performed research, and wrote the paper.

The authors declare no conflict of interest.

## References

- 
- 
- 
- Raeburn I (2005) Graph algebras. *CBMS Regional Conference Series in Mathematics* 103 (American Mathematical Society, Providence, RI).
- 
- 
- Alahmadi A, Alsulami H, Jain SK, Zelmanov E
- Zhevlakov KA, Shestakov IP
- 
- 
- Aranda Pino G, Martin Barquero D, Martin Gonzalez C, Siles Molina M
- Abrams G, Aranda Pino G, Perera F, Siles Molina M
- 
- Jacobson N (1956) Structure of rings. *AMS Colloquium Publications* 37 (American Mathematical Society, Providence, RI).
- 
- 

## Article Classifications

- Physical Sciences
- Mathematics