2.5.3 The Differential Graded Nerve
We now explain how to associate to each differential graded category $\operatorname{\mathcal{C}}$ an $\infty $-category $\operatorname{N}_{\bullet }^{\operatorname{dg}}(\operatorname{\mathcal{C}})$, which we will refer to as the differential graded nerve of $\operatorname{\mathcal{C}}$. We begin by describing the simplices of $\operatorname{N}_{\bullet }^{\operatorname{dg}}(\operatorname{\mathcal{C}})$.
Construction 2.5.3.1. Let $\operatorname{\mathcal{C}}$ be a differential graded category. For $n \geq 0$, we let $\operatorname{N}_{n}^{\operatorname{dg}}(\operatorname{\mathcal{C}})$ denote the collection of all ordered pairs $( \{ X_ i \} _{0 \leq i \leq n}, \{ f_ I \} )$, where:
Each $X_ i$ is an object of the differential graded category $\operatorname{\mathcal{C}}$.
For every subset $I = \{ i_0 > i_{1} > \cdots > i_ k \} \subseteq [n]$ having at least two elements, $f_{I}$ is an element of the abelian group $\operatorname{Hom}_{\operatorname{\mathcal{C}}}( X_{i_{k}}, X_{i_0} )_{k-1}$ which satisfies the identity
\begin{eqnarray*} \partial f_{I} = \sum _{a=1}^{k-1} (-1)^{a} ( f_{ \{ i_0 > i_1 > \cdots > i_ a \} } \circ f_{ \{ i_ a > \cdots > i_ k \} } - f_{I \setminus \{ i_ a \} } ) \end{eqnarray*}
Example 2.5.3.2 (Vertices of the Differential Graded Nerve). Let $\operatorname{\mathcal{C}}$ be a differential graded category. Then $\operatorname{N}_{0}^{\operatorname{dg}}( \operatorname{\mathcal{C}})$ can be identified with the collection $\operatorname{Ob}(\operatorname{\mathcal{C}})$ of objects of $\operatorname{\mathcal{C}}$.
Example 2.5.3.3 (Edges of the Differential Graded Nerve). Let $\operatorname{\mathcal{C}}$ be a differential graded category. Then $\operatorname{N}_{1}^{\operatorname{dg}}(\operatorname{\mathcal{C}})$ can be identified with the collection of all triples $(X_0, X_1, f)$ where $X_0$ and $X_1$ are objects of $\operatorname{\mathcal{C}}$ and $f$ is a $0$-cycle in the chain complex $\operatorname{Hom}_{\operatorname{\mathcal{C}}}( X_0, X_1)_{\bullet }$. In other words, $\operatorname{N}_{1}^{\operatorname{dg}}(\operatorname{\mathcal{C}})$ is the collection of all morphisms in the underlying category $\operatorname{\mathcal{C}}^{\circ }$ of Construction 2.5.2.4.
Example 2.5.3.4 ($2$-Simplices of the Differential Graded Nerve). Let $\operatorname{\mathcal{C}}$ be a differential graded category. Then an element of $\operatorname{N}_{2}^{\operatorname{dg}}(\operatorname{\mathcal{C}})$ is given by the following data:
A triple of objects $X_{0}, X_1, X_2 \in \operatorname{Ob}(\operatorname{\mathcal{C}})$.
A triple of $0$-cycles
\[ f_{10} \in \operatorname{Hom}_{\operatorname{\mathcal{C}}}(X_0, X_1)_{0} \quad \quad f_{20} \in \operatorname{Hom}_{\operatorname{\mathcal{C}}}(X_0, X_2)_{0} \quad \quad f_{21} \in \operatorname{Hom}_{\operatorname{\mathcal{C}}}(X_1, X_2)_{0}. \]
A $1$-chain $f_{210} \in \operatorname{Hom}_{\operatorname{\mathcal{C}}}(X_0, X_2)_{1}$ satisfying the identity
\[ \partial (f_{210}) = f_{20} - (f_{21} \circ f_{10}). \]
Here the $1$-chain $f_{210}$ can be regarded as a witness to the assertion that the $0$-cycles $f_{20}$ and $f_{21} \circ f_{10}$ are homologous: that is, they represent the same element of the homology group $\mathrm{H}_0( \operatorname{Hom}_{\operatorname{\mathcal{C}}}(X_0, X_2) )$. We can present this data graphically by the diagram
\[ \xymatrix@C =50pt@R=50pt{ & X_1 \ar [dr]^{f_{21}} \ar@ {=>}[]+<0pt,-15pt>;+<0pt,-60pt>^-{f_{210}} & \\ X_0 \ar [ur]^{f_{10}} \ar [rr]_{f_{20}} & & X_2. } \]
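The same pattern continues in higher dimensions. Unwinding Construction 2.5.3.1 in the case $n = 3$ and $I = \{ 3 > 2 > 1 > 0 \} $, a $3$-simplex of $\operatorname{N}_{\bullet }^{\operatorname{dg}}(\operatorname{\mathcal{C}})$ supplies, in addition to the data of Example 2.5.3.4 for each of its faces, a $2$-chain $f_{3210} \in \operatorname{Hom}_{\operatorname{\mathcal{C}}}( X_0, X_3 )_{2}$ satisfying the identity
\[ \partial ( f_{3210} ) = ( f_{321} \circ f_{10} - f_{310} ) - ( f_{32} \circ f_{210} - f_{320} ). \]
Here $f_{3210}$ can be regarded as a witness that the $1$-chains $f_{320} + (f_{32} \circ f_{210})$ and $f_{310} + (f_{321} \circ f_{10})$, each of which is a homotopy from $f_{32} \circ f_{21} \circ f_{10}$ to $f_{30}$, are homologous.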
We now explain how to organize the collection $\{ \operatorname{N}_{n}^{\operatorname{dg}}( \operatorname{\mathcal{C}}) \} _{n \geq 0}$ into a simplicial set.
Proposition 2.5.3.5. Let $\operatorname{\mathcal{C}}$ be a differential graded category. Let $m$ and $n$ be nonnegative integers and let $\alpha : [n] \rightarrow [m]$ be a nondecreasing function. Then the construction
\[ ( \{ X_ i \} _{0 \leq i \leq m}, \{ f_{I} \} ) \mapsto ( \{ X_{ \alpha (j) } \} _{ 0 \leq j \leq n}, \{ g_{J} \} ), \]
where the chains $g_{J}$ are defined by
\[ g_{J} = \begin{cases} f_{ \alpha (J) } & \textnormal{ if } \alpha |_{J} \textnormal{ is injective } \\ \operatorname{id}_{ X_ i } & \textnormal{ if } J = \{ j_{0} > j_{1} \} \textnormal{ with } \alpha (j_{0}) = i = \alpha (j_{1}) \\ 0 & \textnormal{ otherwise, } \end{cases} \]
determines a map of sets $\alpha ^{\ast }: \operatorname{N}_{m}^{\operatorname{dg}}(\operatorname{\mathcal{C}}) \rightarrow \operatorname{N}_{n}^{\operatorname{dg}}(\operatorname{\mathcal{C}})$.
Proof.
Let $( \{ X_ i \} _{0 \leq i \leq m}, \{ f_ I \} )$ be an element of $\operatorname{N}_{m}^{\operatorname{dg}}(\operatorname{\mathcal{C}})$. For each subset $J \subseteq [n]$ with at least two elements, define $g_{J}$ as in the statement of Proposition 2.5.3.5. We wish to show that $( \{ X_{ \alpha (j) } \} _{ 0 \leq j \leq n}, \{ g_{J} \} )$ is an element of $\operatorname{N}_{n}^{\operatorname{dg}}(\operatorname{\mathcal{C}})$. For this, we must show that for each subset
\[ J = \{ j_{0} > j_{1} > \cdots > j_{k-1} > j_{k} \} \subseteq [n] \]
having at least two elements, we have an equality
\begin{eqnarray} \label{equation:functoriality-of-dg-nerve} \partial g_{J} = \sum _{0 < a < k} (-1)^{a} ( g_{ \{ j_0 > j_1 > \cdots > j_ a \} } \circ g_{ \{ j_ a > \cdots > j_ k \} } - g_{J \setminus \{ j_ a \} } ). \end{eqnarray}
We distinguish three cases:
Suppose that the restriction $\alpha |_{J}$ is injective. In this case, we can rewrite (2.27) as an equality
\begin{eqnarray*} \partial f_{\alpha (J)} = \sum _{0 < a < k} (-1)^{a} ( f_{ \{ \alpha (j_0) > \cdots > \alpha (j_ a) \} } \circ f_{ \{ \alpha (j_ a) > \cdots > \alpha (j_ k) \} } - f_{\alpha (J) \setminus \{ \alpha (j_ a) \} } ), \end{eqnarray*}
which follows from our assumption that $( \{ X_ i \} _{0 \leq i \leq m}, \{ f_ I \} )$ is an element of $\operatorname{N}_{m}^{\operatorname{dg}}(\operatorname{\mathcal{C}})$.
Suppose that $J = \{ j_{0} > j_{1} \} $ is a two-element set satisfying $\alpha (j_{0}) = i = \alpha (j_{1} )$ for some $0 \leq i \leq m$. In this case, we can rewrite (2.27) as an equality $\partial (\operatorname{id}_{ X_ i}) = 0$, which follows from Remark 2.5.2.2.
Suppose that $J = \{ j_{0} > j_{1} > \cdots > j_{k-1} > j_{k} \} $ has at least three elements and that $\alpha |_{J}$ is not injective, so that $g_{J} = 0$. We now distinguish three (possibly overlapping) cases:
The restriction $\alpha |_{J}$ is not injective because $\alpha (j_{0}) = i = \alpha ( j_{1} )$ for some $0 \leq i \leq m$. In this case, the expressions $g_{ J \setminus \{ j_ a \} }$ and $g_{ \{ j_{0} > \cdots > j_{a} \} }$ vanish for $1 < a < k$. We can therefore rewrite (2.27) as an equality
\[ g_{J \setminus \{ j_1 \} } = g_{ \{ j_0 > j_1 \} } \circ g_{ \{ j_{1} > \cdots > j_{k} \} }, \]
which follows from the identities $g_{ J \setminus \{ j_1 \} } = g_{ \{ j_1 > \cdots > j_ k \} }$ and $g_{ \{ j_0 > j_1 \} } = \operatorname{id}_{ X_ i }$.
The restriction $\alpha |_{J}$ is not injective because $\alpha (j_{k-1} ) = i = \alpha ( j_{k} )$ for some $0 \leq i \leq m$. In this case, the expressions $g_{J \setminus \{ j_ a \} }$ and $g_{ \{ j_{a} > \cdots > j_ k \} }$ vanish for $0 < a < k-1$. We can therefore rewrite (2.27) as an equality
\[ g_{J \setminus \{ j_{k-1} \} } = g_{ \{ j_0 > \cdots > j_{k-1} \} } \circ g_{ \{ j_{k-1} > j_ k \} }, \]
which follows from the identities $g_{J \setminus \{ j_{k-1} \} } = g_{ \{ j_{0} > \cdots > j_{k-1} \} }$ and $g_{ \{ j_{k-1} > j_{k} \} } = \operatorname{id}_{ X_ i }$.
The restriction $\alpha |_{J}$ is not injective because we have $\alpha ( j_{b} ) = \alpha ( j_{b+1} )$ for some $0 < b < k-1$. In this case, the chains $g_{ J \setminus \{ j_{a} \} }$ vanish for $a \notin \{ b, b+1\} $, and the compositions $g_{ \{ j_0 > \cdots > j_ a \} } \circ g_{ \{ j_ a > \cdots > j_ k \} }$ vanish for all $0 < a < k$. We can therefore rewrite (2.27) as an equality $g_{J \setminus \{ j_ b \} } = g_{ J \setminus \{ j_{b+1} \} }$, which is clear.
$\square$
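For example, let $\alpha : [2] \rightarrow [1]$ be the nondecreasing function with $\alpha (0) = \alpha (1) = 0$ and $\alpha (2) = 1$, and let $( X_0, X_1, f )$ be an element of $\operatorname{N}_{1}^{\operatorname{dg}}(\operatorname{\mathcal{C}})$ as in Example 2.5.3.3. The formula of Proposition 2.5.3.5 then produces the $2$-simplex with vertices $X_0, X_0, X_1$ and chains
\[ g_{ \{ 1 > 0 \} } = \operatorname{id}_{X_0} \quad \quad g_{ \{ 2 > 0 \} } = f = g_{ \{ 2 > 1 \} } \quad \quad g_{ \{ 2 > 1 > 0 \} } = 0, \]
and the identity of Example 2.5.3.4 is satisfied because $\partial (0) = 0 = f - ( f \circ \operatorname{id}_{X_0} )$.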
Exercise 2.5.3.6. Let $\operatorname{\mathcal{C}}$ be a differential graded category. Suppose we are given a pair of nondecreasing functions $\alpha : [k] \rightarrow [m]$ and $\beta : [m] \rightarrow [n]$. Show that the function $(\beta \circ \alpha )^{\ast }$ of Proposition 2.5.3.5 coincides with the composition $\alpha ^{\ast } \circ \beta ^{\ast }$.
Definition 2.5.3.7. Let $\operatorname{\mathcal{C}}$ be a differential graded category. We let $\operatorname{N}_{\bullet }^{\operatorname{dg}}(\operatorname{\mathcal{C}})$ denote the simplicial set whose value on an object $[n] \in \operatorname{{\bf \Delta }}^{\operatorname{op}}$ is the set $\operatorname{N}_{n}^{\operatorname{dg}}(\operatorname{\mathcal{C}})$ of Construction 2.5.3.1, and whose value on a nondecreasing function $\alpha : [n] \rightarrow [m]$ is the function $\alpha ^{\ast }: \operatorname{N}_{m}^{\operatorname{dg}}(\operatorname{\mathcal{C}}) \rightarrow \operatorname{N}_{n}^{\operatorname{dg}}(\operatorname{\mathcal{C}})$ of Proposition 2.5.3.5. We will refer to $\operatorname{N}_{\bullet }^{\operatorname{dg}}(\operatorname{\mathcal{C}})$ as the differential graded nerve of $\operatorname{\mathcal{C}}$.
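In the special case where each of the complexes $\operatorname{Hom}_{\operatorname{\mathcal{C}}}(X,Y)_{\bullet }$ is concentrated in degree zero, the differential graded nerve recovers the nerve of the underlying category $\operatorname{\mathcal{C}}^{\circ }$ of Construction 2.5.2.4. Indeed, for every subset $I \subseteq [n]$ with at least three elements, the group $\operatorname{Hom}_{\operatorname{\mathcal{C}}}( X_{i_ k}, X_{i_0} )_{k-1}$ vanishes, so that $f_{I} = 0$, and the identity of Construction 2.5.3.1 for three-element subsets reduces to
\[ 0 = \partial f_{ \{ i_0 > i_1 > i_2 \} } = f_{ \{ i_0 > i_2 \} } - ( f_{ \{ i_0 > i_1 \} } \circ f_{ \{ i_1 > i_2 \} } ). \]
An $n$-simplex of $\operatorname{N}_{\bullet }^{\operatorname{dg}}(\operatorname{\mathcal{C}})$ is therefore a composable chain of morphisms of $\operatorname{\mathcal{C}}^{\circ }$ together with their composites: that is, an $n$-simplex of the nerve of $\operatorname{\mathcal{C}}^{\circ }$.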
Remark 2.5.3.9. Let $\operatorname{\mathcal{C}}$ be a differential graded category and let $K_{\bullet }$ be a simplicial set. Unwinding Construction 2.5.3.1 and Proposition 2.5.3.5, a morphism of simplicial sets $f: K_{\bullet } \rightarrow \operatorname{N}_{\bullet }^{\operatorname{dg}}(\operatorname{\mathcal{C}})$ can be identified with the following data: an object $f(x) \in \operatorname{Ob}(\operatorname{\mathcal{C}})$ for each vertex $x$ of $K_{\bullet }$, and a chain $f(\sigma ) \in \operatorname{Hom}_{\operatorname{\mathcal{C}}}( f(x), f(y) )_{k-1}$ for each $k$-simplex $\sigma $ of $K_{\bullet }$ of dimension $k \geq 1$, where $x$ and $y$ denote the initial and final vertices of $\sigma $. Moreover, this data must satisfy the following conditions:
If $e$ is a degenerate edge of $K_{\bullet }$ connecting a vertex $x$ to itself, then $f(e)$ is the identity morphism $\operatorname{id}_{f(x)} \in \operatorname{Hom}_{\operatorname{\mathcal{C}}}( f(x), f(x) )_{0}$.
If $\sigma $ is a degenerate simplex of $K_{\bullet }$ having dimension $\geq 2$, then $f(\sigma ) = 0$.
Let $k > 0$ and let $\sigma : \Delta ^ k \rightarrow K_{\bullet }$ be a $k$-simplex of $K_{\bullet }$. For $0 < b < k$, let $\sigma _{\leq b}: \Delta ^{b} \rightarrow K_{\bullet }$ denote the composition of $\sigma $ with the inclusion map $\Delta ^{b} \hookrightarrow \Delta ^{k}$ (which is the identity on vertices), and let $\sigma _{\geq b}: \Delta ^{k-b} \rightarrow K_{\bullet }$ denote the composition of $\sigma $ with the map $\Delta ^{k-b} \hookrightarrow \Delta ^{k}$ given on vertices by $i \mapsto i+b$. Then we have
\begin{eqnarray*} \partial f(\sigma ) = \sum _{b=1}^{k-1} (-1)^{k-b} (f( \sigma _{\geq b}) \circ f( \sigma _{\leq b} ) - f( d^{k}_ b \sigma ) ) \end{eqnarray*}
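For example, when $k = 2$ the sum has a single term $b = 1$, and the identity reads
\[ \partial f(\sigma ) = f( d^{2}_{1} \sigma ) - ( f( \sigma _{\geq 1} ) \circ f( \sigma _{\leq 1} ) ), \]
so that $f(\sigma )$ witnesses that the $0$-cycles $f( d^{2}_{1} \sigma )$ and $f( \sigma _{\geq 1} ) \circ f( \sigma _{\leq 1} )$ are homologous, as in Example 2.5.3.4.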
Theorem 2.5.3.10. Let $\operatorname{\mathcal{C}}$ be a differential graded category. Then the simplicial set $\operatorname{N}_{\bullet }^{\operatorname{dg}}(\operatorname{\mathcal{C}})$ is an $\infty $-category.
Proof.
Suppose we are given $0 < j < n$ and a map of simplicial sets $\sigma _0: \Lambda ^{n}_{j} \rightarrow \operatorname{N}_{\bullet }^{\operatorname{dg}}(\operatorname{\mathcal{C}})$. Using Remark 2.5.3.9, we see that $\sigma _0$ can be identified with the data of a pair $( \{ X_ i \} _{0 \leq i \leq n}, \{ f_ I \} )$, where $\{ X_ i \} _{0 \leq i \leq n}$ is a collection of objects of $\operatorname{\mathcal{C}}$ and $f_{I} \in \operatorname{Hom}_{\operatorname{\mathcal{C}}}( X_{i_ k}, X_{i_0} )_{k-1}$ is defined for every subset $I = \{ i_0 > i_{1} > \cdots > i_{k} \} \subseteq [n]$ for which $k > 0$ and $[n] \neq I \neq [n] \setminus \{ j\} $, satisfying the identity
\begin{eqnarray} \label{equation:checking-dg-nerve-is-infty-category} \partial f_{I} = \sum _{a=1}^{k-1} (-1)^{a} ( f_{ \{ i_0 > i_1 > \cdots > i_ a \} } \circ f_{ \{ i_ a > \cdots > i_ k \} } - f_{I \setminus \{ i_ a \} } ) \end{eqnarray}
We wish to show that $\sigma _0$ can be extended to an $n$-simplex of $\operatorname{N}_{\bullet }^{\operatorname{dg}}(\operatorname{\mathcal{C}})$. To give such an extension, we must supply chains $f_{[n]} \in \operatorname{Hom}_{\operatorname{\mathcal{C}}}(X_0, X_ n)_{n-1}$ and $f_{[n] \setminus \{ j\} } \in \operatorname{Hom}_{\operatorname{\mathcal{C}}}(X_0, X_ n)_{n-2}$ which satisfy (2.30) in the cases $I = [n]$ and $I = [n] \setminus \{ j\} $. We claim that there is a unique such extension which also satisfies $f_{[n]} = 0$. Applying (2.30) in the case $I = [n]$, we deduce that $f_{[n] \setminus \{ j\} }$ is necessarily given by
\[ f_{ [n] \setminus \{ j\} } = \sum _{ 0 < b < n} (-1)^{b-j} (f_{ \{ n > \cdots > b \} } \circ f_{ \{ b > \cdots > 0\} }) - \sum _{ 0 < b < n, b \neq j} (-1)^{b-j} f_{[n] \setminus \{ b\} }. \]
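For example, when $n = 2$ and $j = 1$, the horn $\Lambda ^{2}_{1}$ supplies $0$-cycles $f_{ \{ 1 > 0 \} }$ and $f_{ \{ 2 > 1 \} }$, the second sum is empty, and the formula reduces to
\[ f_{ \{ 2 > 0 \} } = f_{ \{ 2 > 1 \} } \circ f_{ \{ 1 > 0 \} }, \]
so the unique extension with $f_{[2]} = 0$ is the $2$-simplex of Example 2.5.3.4 in which $f_{20}$ is the composition $f_{21} \circ f_{10}$ and the homotopy $f_{210}$ vanishes.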
To complete the proof, it will suffice to verify that this prescription also satisfies (2.30) in the case $I = [n] \setminus \{ j\} $. In what follows, for $0 \leq a < b \leq n$, let us write $[ba]$ for the set $\{ b > b-1 > \cdots > a \} $. We now compute
\begin{eqnarray*} (-1)^{j} \partial f_{[n] \setminus \{ j\} } & = & \sum _{ 0 < b < n} (-1)^{b} \partial (f_{ [nb]} f_{[b0]}) - \sum _{ 0 < b < n, b \neq j} (-1)^{b} \partial f_{[n] \setminus \{ b\} } \\ & = & \sum _{0 < b < n} (-1)^{b} (\partial f_{[nb]}) f_{[b0]} - \sum _{0 < b < n} (-1)^{n} f_{[nb]} (\partial f_{[b0]}) \\ & & - \sum _{ 0 < b < n, b \neq j} (-1)^{b} \partial f_{ [n] \setminus \{ b\} } \\ & = & \sum _{ 0 < b < c < n} (-1)^{n-c+b} f_{ [nc] } f_{[cb]} f_{[b0] } - \sum _{ 0 < b < c < n} (-1)^{n-c+b} (f_{[nb] \setminus \{ c\} } f_{[b0]}) - \\ & & \sum _{ 0 < a < b < n } (-1)^{n+b-a} f_{[nb]} f_{[ba]} f_{[a0]} + \sum _{ 0 < a < b < n } (-1)^{n+b-a} f_{[nb]} f_{[b0] \setminus \{ a\} } - \\ & & \sum _{ 0 < b < c < n, b \neq j } (-1)^{b+n-c} f_{[nc]} f_{[c0] \setminus \{ b\} } + \sum _{ 0 < b < c < n, b \neq j } (-1)^{b+n-c} f_{[n0] \setminus \{ b,c\} } + \\ & & \sum _{ 0 < a < b < n, b \neq j } (-1)^{b+n-a} f_{[na] \setminus \{ b\} } f_{[a0]} - \sum _{ 0 < a < b < n, b \neq j } (-1)^{b+n-a} f_{[n0] \setminus \{ a,b\} }. \end{eqnarray*}
Here the first and third terms cancel, the seventh term cancels with the second except for those summands with $c=j$, the fifth term cancels with the fourth except for those summands with $a = j$, and the sixth term cancels the eighth except for those terms with $c = j$ and $a = j$, respectively. Multiplying by $(-1)^{j}$, we can rewrite this identity as
\begin{eqnarray*} \partial f_{[n] \setminus \{ j\} } & = & \sum _{ 0 < b < j } (-1)^{n-1-b} (f_{ [nb] \setminus \{ j \} } \circ f_{ [b0] }) + \sum _{ j < b < n} (-1)^{n-b} (f_{[nb]} \circ f_{[b0] \setminus \{ j\} } ) \\ & & - \sum _{0 < b < j} (-1)^{n-1-b} f_{[n] \setminus \{ b,j\} } - \sum _{ j < b < n } (-1)^{n-b} f_{ [n] \setminus \{ b,j \} }, \end{eqnarray*}
which recovers equation (2.30) in the case $I = [n] \setminus \{ j\} $.
$\square$