\documentclass[12pt,reqno]{article}
\usepackage[usenames]{color}
\usepackage{amssymb}
\usepackage{graphicx}
\usepackage{amscd}
\usepackage[colorlinks=true,
linkcolor=webgreen,
filecolor=webbrown,
citecolor=webgreen]{hyperref}
\definecolor{webgreen}{rgb}{0,.5,0}
\definecolor{webbrown}{rgb}{.6,0,0}
\usepackage{color}
\usepackage{fullpage}
\usepackage{float}
\usepackage{psfig}
\usepackage{graphics,amsmath,amssymb}
\usepackage{amsthm}
\usepackage{amsfonts}
\usepackage{latexsym}
\usepackage{epsf}
\usepackage{xspace}
\usepackage{enumerate}
\usepackage{algorithm}
%\usepackage[dvips]{graphicx}
\usepackage[left=2.54cm,right=2.54cm]{geometry}
\usepackage{emptypage}
\usepackage[usenames,svgnames]{xcolor}
\usepackage{pgf}
\usepackage{tikz}
\usetikzlibrary{arrows,automata}
\usetikzlibrary{decorations.markings}
\DeclareMathOperator{\tipe}{type}
\DeclareMathOperator{\lcm}{lcm}
\DeclareMathOperator{\connect}{connec}
\DeclareMathOperator{\rnk}{rank}
\DeclareMathOperator{\truu}{true}
\DeclareMathOperator{\indx}{Index}
\DeclareMathOperator{\perm}{perm}
\DeclareMathOperator{\chek}{check}
\usepackage{authblk}
\renewcommand\Affilfont{\fontsize{8}{9.8}\itshape}
\setlength{\textwidth}{6.5in}
\setlength{\oddsidemargin}{.1in}
\setlength{\evensidemargin}{.1in}
\setlength{\topmargin}{-.1in}
\setlength{\textheight}{8.4in}
\newcommand{\seqnum}[1]{\href{http://oeis.org/#1}{\underline{#1}}}
\begin{document}
\begin{center}
\epsfxsize=4in
\leavevmode\epsffile{logo129.eps}
\end{center}
\theoremstyle{plain}
\newtheorem{theorem}{Theorem}
\newtheorem{corollary}[theorem]{Corollary}
\newtheorem{lemma}[theorem]{Lemma}
\newtheorem{proposition}[theorem]{Proposition}
\theoremstyle{definition}
\newtheorem{definition}[theorem]{Definition}
\newtheorem{example}[theorem]{Example}
\newtheorem{conjecture}[theorem]{Conjecture}
\theoremstyle{remark}
\newtheorem{remark}[theorem]{Remark}
\begin{center}
\vskip 1cm{\LARGE\bf Counting Hidden Neural Networks
}
\vskip 1cm
Anthony Richard \\
D\'epartement de Math\'ematiques et de Statistique \\
Universit\'e Laval \\
Qu\'ebec \\
Canada\\
\ \\
Patrick Desrosiers \\
Centre de Recherche de L'Institut
Universitaire en Sant\'e Mentale de Qu\'ebec \
and \\
D\'epartement de Physique, de G\'enie Physique et d'Optique \\
Universit\'e Laval \\
Qu\'ebec \\
Canada \\
\ \\
Simon Hardy \\
Centre de Recherche de L'Institut
Universitaire en Sant\'e Mentale de Qu\'ebec \\
and \\
D\'epartement de Biochimie, Microbiologie et Bio-Informatique \\
and \\
D\'epartement d'Informatique et de G\'enie Logiciel \\
Universit\'e Laval \\
Qu\'ebec \\
Canada \\
\ \\
Nicolas Doyon\\
D\'epartement de Math\'ematiques et de Statistique \\
Universit\'e Laval \\
and \\
Centre de Recherche de L'Institut Universitaire en Sant\'e Mentale de Qu\'ebec \\
Qu\'ebec \\
Canada\\
\href{mailto:nicolas.doyon@mat.ulaval.ca}{\tt nicolas.doyon@mat.ulaval.ca}
\end{center}
\pagebreak
\tikzset{->-/.style={decoration={
markings,
mark=at position #1 with {\arrow{>}}},postaction={decorate}}}
\begin{abstract}
We apply combinatorial tools, including P\'olya's theorem, to
enumerate all possible networks for which (1) the network contains
distinguishable input and output nodes as well as partially
distinguishable intermediate nodes; (2) all connections are directed
and for each pair of nodes, there are at most two connections, that is,
at most one connection per direction; (3) input nodes send connections
but do not receive any, while output nodes receive connections but do not
send any; (4) every intermediate node receives a path from an input
node and sends a path to at least one output node; and (5) input nodes do not
send direct connections to output nodes. We first obtain the generating
function for the number of such networks, and then use it to obtain
precise estimates for the number of networks. Finally, we develop a
computer algorithm that allows us to generate these networks. This work
could become useful in the field of neuroscience, in which the problem of
deciphering the structure of hidden networks is of the utmost
importance, since there are several instances in which the activity of
input and output neurons can be directly measured, while no direct
access to the intermediate network is possible. Our results can also be
used to count the number of finite automata in which each cell plays a
relevant role.
\end{abstract}
\section{Introduction}
The problem of counting networks, graphs or digraphs with a given set of conditions has a rich mathematical history that was greatly influenced by the seminal work of P\'{o}lya and Harary \cite{Polya1937,Harary1955,Harary1957}. Results in this field have many applications in scientific fields such as chemistry, resource allocation and number theory \cite{Knopfmacher2002,Adelio2002,Alon2008,Craven1980}. Several versions of the graph (or digraph) enumeration problem have been investigated, depending on whether the vertices are labelled (distinguishable) or not (indistinguishable) or whether directed, connected or simple graphs are considered \cite{Bender1983,Pittel2005,Birsdorff2008}.
In some applications, such as the description of neural networks or cellular automata and network coding theory, graphs describe the flow of information from a set of input nodes to output ones \cite{Hopfield,Ahlswede2000, Yan2012}. In such cases, the input and output nodes are often called sources and sinks, respectively. An intermediate node (neuron or cell) is relevant to the global behaviour of the system only if it is (directly or indirectly) connected to both an input and an output element. Enumeration problems of such structures have attracted interest in the field of finite cellular automata \cite{Twining, Grassberger}, where the size of a given class of automata can be related to the computing power of this class \cite{Fuks, Semenovich}. Moreover,
in some instances in neuroscience, it is possible to directly measure the input and the output of a neural network without having access to direct measurements of the intermediate hidden network, as in the investigation of the pain-processing network of the spinal cord \cite{Lavertu2014}. The question as to what extent it is possible to infer this structure from the input-output relation is of great importance to the field of computational neuroscience and poses fascinating problems from a mathematical point of view.
\subsection{The problem}
In the present paper, we enumerate the possible networks for a given number of input, output and intermediate nodes. Since we assume that interactions (input manipulation or data retrieval) are only possible or relevant with input and output nodes, we consider these as labelled while we treat the intermediate nodes as unlabelled. We also differentiate between two types of intermediate nodes. In the context of neural networks, this could correspond to differentiating between excitatory and inhibitory interneurons for instance. The techniques we develop could be used to treat the cases in which the intermediate nodes are subdivided into an arbitrary number of subtypes which could describe automata composed of many types of fundamental units.
Specifically, we enumerate the networks composed of nodes (neurons or cells) and links (connections) that satisfy the following conditions:
\begin{enumerate}
\item[1.1] Every node belongs to one and only one of the following types: input node, intermediate node of type I, intermediate node of type II, output node.
\item[1.2] All input and output nodes are distinguishable (labelled).
\item[1.3] All intermediate nodes of type I are indistinguishable and all intermediate nodes of type II are indistinguishable.
\item[2.1] All links are directed.
\item[2.2] There is at most one link from a given node to another node.
\item[3.1] All input nodes send outward links but do not receive inward links.
\item[3.2] All output nodes receive inward links but do not send outward links.
\item[3.3] There are no direct links from input nodes to output nodes.
\item[4.1] Every intermediate node receives a path from at least one input node.
\item[4.2] Every intermediate node sends a path to at least one output node.
\end{enumerate}
Recall that the adjacency matrix $A$ of a network with $N$ nodes is a square matrix containing $N\times N$ elements, $A_{i,j}$, such that $A_{i,j}=1$ if there is a link from node $i$ to node $j$ and $A_{i,j}=0$ otherwise.
Conditions $3.1$, $3.2$ and $3.3$ thus imply that the adjacency matrix $A$ has the following form:
$$ A=\begin{pmatrix}0_{m\times m}& B_{m\times n}&0_{m\times k}\\0_{n\times m} & C_{n\times n} & D_{n\times k}\\
0_{k\times m}&0_{k\times n}&0_{k\times k}\end{pmatrix},$$
where the subscripts indicate the size of the submatrices, $0_{p\times q}$ stands for a submatrix whose elements are all equal to zero, $B_{m\times n}$ and $D_{n\times k}$ are binary matrices with no special form, and $C_{n\times n}$ is a binary matrix whose diagonal elements are all equal to zero.
Conditions $4.1$ and $4.2$ are equivalent to the fact that every intermediate node is relevant in the network. Indeed, the activity of an interneuron that would not receive a path from an input neuron or that would not send a path to an output neuron could not be detected from input/output measurements. Similarly, in the context of finite automata, such a cell would not impact the function computed by the automaton. Note that the indistinguishability of the intermediate nodes imposes a significant constraint on our enumeration problem: permutations of the intermediate nodes of the same type are only counted once. Any network that complies with all the above conditions is characterized as admissible. Examples of admissible and non-admissible networks are given in Figure 1.
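The admissibility conditions can be verified mechanically from the adjacency matrix. The following Python sketch (ours, for illustration only) checks the path conditions 4.1 and 4.2 by graph search, assuming the block structure imposed by conditions 3.1--3.3 already holds:

```python
def is_admissible(m, n, k, A):
    """Check conditions 4.1 and 4.2, given an (m+n+k) x (m+n+k)
    0/1 adjacency matrix A whose block structure already satisfies
    conditions 3.1-3.3.  Nodes 0..m-1 are inputs, m..m+n-1 are
    intermediate nodes and m+n..m+n+k-1 are outputs."""
    N = m + n + k

    def reachable(sources):
        # forward search along the directed links
        seen, stack = set(sources), list(sources)
        while stack:
            i = stack.pop()
            for j in range(N):
                if A[i][j] and j not in seen:
                    seen.add(j)
                    stack.append(j)
        return seen

    outputs = set(range(m + n, N))
    from_inputs = reachable(range(m))             # condition 4.1
    to_outputs = {i for i in range(N)             # condition 4.2
                  if reachable([i]) & outputs}
    return all(v in from_inputs and v in to_outputs
               for v in range(m, m + n))
```

For instance, the three-connection network of Figure 2 (input $\to$ first intermediate $\to$ second intermediate $\to$ output) passes this check, while removing its last connection makes the intermediate nodes fail condition 4.2.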
We let $N(m,n_1,n_2,k)$ denote the number of admissible networks with $m$ input nodes, $n_1$ intermediate nodes of type I, $n_2$ intermediate nodes of type II and $k$ output nodes. For the sake of brevity, we also write $n=n_1+n_2$. For illustration, we display in Figure 2 all the admissible networks counted by $N(1,2,0,1)$, where the input node is on the left of each network, the intermediate nodes (in green) in the middle and the output one on the right. From this list of examples we conclude that $N(1,2,0,1)=10$. \\
\begin{figure}\label{figFirst}
\centering
\begin{tabular}{ccc}
\begin{tikzpicture}
\draw[black, thick, ->-=.5] (0,1.5) to (2,2);
\draw[black, thick, ->-=.5] (0,1.5) to (2,0);
\draw[black, thick, ->-=.5] (2,2) to (2,1);
\draw[black, thick, ->-=.5] (2,2) to (4,1);
\draw[black, thick, ->-=.5] (2,1) to (4,1);
\draw[black, thick, ->-=.7] (2,0) to [bend left] (2,1);
\draw[black, thick, ->-=.5] (0,0.5) to (2,0);
\draw[black, thick, ->-=.7] (2,1) to [bend left] (2,0);
\draw [fill=black] (0,0.5) circle (0.6 ex);
\draw [fill=black] (0,1.5) circle (0.6 ex);
\draw [fill=ForestGreen] (2,0) circle (0.6 ex);
\draw [fill=ForestGreen] (2,1) circle (0.6 ex);
\draw [fill=red] (2,2) circle (0.6 ex);
\draw [fill=blue] (4,1) circle (0.6 ex);
\end{tikzpicture}
&
\begin{tikzpicture}
\draw[black, thick, ->-=.5] (0,1.5) to (2,2);
\draw[black, thick, ->-=.5] (0,0.5) to (2,2);
\draw[black, thick, ->-=.5] (0,1.5) to (2,0);
\draw[black, thick, ->-=.5] (2,2) to (2,1);
\draw[black, thick, ->-=.5] (2,0) to (2,1);
\draw[black, thick, ->-=.5] (2,2) to (4,1);
\draw[black, thick, ->-=.7] (2,0) to (4,1);
\draw[black, thick, ->-=.7] (2,0) to [bend left] (2,2);
\draw[black, thick, ->-=.7] (2,2) to [bend left] (2,0);
\draw[black, thick, ->-=.5] (0,0.5) to (2,0);
\draw [fill=black] (0,0.5) circle (0.6 ex);
\draw [fill=black] (0,1.5) circle (0.6 ex);
\draw [fill=ForestGreen] (2,0) circle (0.6 ex);
\draw [fill=ForestGreen] (2,1) circle (0.6 ex);
\draw [fill=red] (2,2) circle (0.6 ex);
\draw [fill=blue] (4,1) circle (0.6 ex);
\end{tikzpicture}
& \begin{tikzpicture}
\draw[black, thick, ->-=.5] (0,1.5) to (2,2);
\draw[black, thick, ->-=.5] (0,0.5) to (2,2);
\draw[black, thick, ->-=.5] (0,1.5) to (2,0);
\draw[black, thick, ->-=.5] (2,2) to (4,1);
\draw[black, thick, ->-=.7] (2,1) to (4,1);
\draw[black, thick, ->-=.7] (2,0) to [bend left] (2,2);
\draw[black, thick, ->-=.7] (2,1) to [bend right] (2,2);
\draw[black, thick, ->-=.7] (2,1) to [bend left] (2,0);
\draw[black, thick, ->-=.5] (0,0.5) to (2,0);
\draw [fill=black] (0,0.5) circle (0.6 ex);
\draw [fill=black] (0,1.5) circle (0.6 ex);
\draw [fill=ForestGreen] (2,0) circle (0.6 ex);
\draw [fill=ForestGreen] (2,1) circle (0.6 ex);
\draw [fill=red] (2,2) circle (0.6 ex);
\draw [fill=blue] (4,1) circle (0.6 ex);
\end{tikzpicture}
\end{tabular}
\caption{{\footnotesize Admissible and non-admissible networks. Black, green, red, and blue represent input nodes, intermediate nodes of type I, intermediate nodes of type II, and output nodes, respectively. Left: Admissible network. Middle: Non-admissible network; the intermediate node at the center does not send a path to an output node. Right: Non-admissible network; the intermediate node at the center does not receive any path from an input node.}}
\end{figure}
\begin{figure}\label{listNets}
\centering
\begin{tikzpicture}
%Premier Graphe
%Les arêtes
\draw[black, thick, ->-=.5] (0,0) to (1.5,0.5);
\draw[black, thick, ->-=.5] (1.5,0.5) to (1.5,-0.5);
\draw[black, thick, ->-=.5] (1.5,-0.5) to (3,0);
%Les sommets ou neurones
\draw (0,0) node {$\bullet$};
\draw [ForestGreen] (1.5,0.5) node {$\bullet$};
\draw [ForestGreen] (1.5,-0.5) node {$\bullet$};
\draw (3,0) node {$\bullet$};
%Séparateur
\draw [-,gray,thick] (3.5,-1) -- (3.5,1);
%Deuxieme Graphe
%Les arêtes
\draw[black, thick, ->-=.5] (4,0) to (5.5,-0.5);
\draw[black, thick, ->-=.5] (4,0) to (5.5,0.5);
\draw[black, thick, ->-=.5] (5.5,-0.5) to (5.5,0.5);
\draw[black, thick, ->-=.5] (5.5,0.5) to (7,0);
%Les sommets ou neurones
\draw (4,0) node {$\bullet$};
\draw [ForestGreen] (5.5,0.5) node {$\bullet$};
\draw [ForestGreen] (5.5,-0.5) node {$\bullet$};
\draw (7,0) node {$\bullet$};
%Séparateur
\draw [-,gray,thick] (7.5,-1) -- (7.5,1);
%Troisieme Graphe
%Les arêtes
\draw[black, thick, ->-=.5] (8,0) to (9.5,-0.5);
\draw[black, thick, ->-=.5] (9.5,-0.5) to (9.5,0.5);
\draw[black, thick, ->-=.5] (9.5,0.5) to (11,0);
\draw[black, thick, ->-=.5] (9.5,-0.5) to (11,0);
%Les sommets ou neurones
\draw (8,0) node {$\bullet$};
\draw [ForestGreen] (9.5,0.5) node {$\bullet$};
\draw [ForestGreen] (9.5,-0.5) node {$\bullet$};
\draw (11,0) node {$\bullet$};
\end{tikzpicture}
\begin{tikzpicture}
\draw [-,gray,thick] (-1,1) -- (12,1);
\end{tikzpicture}
\begin{tikzpicture}
%Quatrieme Graphe
%Les arêtes
\draw[black, thick, ->-=.5] (0,0) to (1.5,0.5);
\draw[black, thick, ->-=.5] (1.5,0.5) to (3,0);
\draw[black, thick, ->-=.5] (1.5,-0.5) to [bend left] (1.5,0.5);
\draw[black, thick, ->-=.5] (1.5,0.5) to [bend left] (1.5,-0.5);
%Les sommets ou neurones
\draw (0,0) node {$\bullet$};
\draw [ForestGreen] (1.5,0.5) node {$\bullet$};
\draw [ForestGreen] (1.5,-0.5) node {$\bullet$};
\draw (3,0) node {$\bullet$};
%Séparateur
\draw [-,gray,thick] (3.5,-1) -- (3.5,1);
%Cinquieme Graphe
%Les arêtes
\draw[black, thick, ->-=.5] (4,0) to (5.5,-0.5);
\draw[black, thick, ->-=.5] (4,0) to (5.5,0.5);
\draw[black, thick, ->-=.5] (5.5,0.5) to (7,0);
\draw[black, thick, ->-=.5] (5.5,-0.5) to (7,0);
%Les sommets ou neurones
\draw (4,0) node {$\bullet$};
\draw [ForestGreen] (5.5,0.5) node {$\bullet$};
\draw [ForestGreen] (5.5,-0.5) node {$\bullet$};
\draw (7,0) node {$\bullet$};
%Séparateur
\draw [-,gray,thick] (7.5,-1) -- (7.5,1);
%Sixieme Graphe
%Les arêtes
\draw[black, thick, ->-=.5] (8,0) to (9.5,-0.5);
\draw[black, thick, ->-=.5] (9.5,0.5) to (11,0);
\draw[black, thick, ->-=.5] (9.5,-0.5) to [bend left] (9.5,0.5);
\draw[black, thick, ->-=.5] (9.5,0.5) to [bend left] (9.5,-0.5);
%Les sommets ou neurones
\draw (8,0) node {$\bullet$};
\draw [ForestGreen] (9.5,0.5) node {$\bullet$};
\draw [ForestGreen] (9.5,-0.5) node {$\bullet$};
\draw (11,0) node {$\bullet$};
\end{tikzpicture}
\begin{tikzpicture}
\draw [-,gray,thick] (-1,1) -- (12,1);
\end{tikzpicture}
\begin{tikzpicture}
%Septieme Graphe
%Les arêtes
\draw[black, thick, ->-=.5] (0,0) to (1.5,0.5);
\draw[black, thick, ->-=.5] (0,0) to (1.5,-0.5);
\draw[black, thick, ->-=.5] (1.5,0.5) to (3,0);
\draw[black, thick, ->-=.5] (1.5,-0.5) to [bend left] (1.5,0.5);
\draw[black, thick, ->-=.5] (1.5,0.5) to [bend left] (1.5,-0.5);
%Les sommets ou neurones
\draw (0,0) node {$\bullet$};
\draw [ForestGreen] (1.5,0.5) node {$\bullet$};
\draw [ForestGreen] (1.5,-0.5) node {$\bullet$};
\draw (3,0) node {$\bullet$};
%Séparateur
\draw [-,gray,thick] (3.5,-1) -- (3.5,1);
%Huitieme Graphe
%Les arêtes
\draw[black, thick, ->-=.5] (4,0) to (5.5,-0.5);
\draw[black, thick, ->-=.5] (4,0) to (5.5,0.5);
\draw[black, thick, ->-=.5] (5.5,0.5) to (5.5,-0.5);
\draw[black, thick, ->-=.5] (5.5,0.5) to (7,0);
\draw[black, thick, ->-=.5] (5.5,-0.5) to (7,0);
%Les sommets ou neurones
\draw (4,0) node {$\bullet$};
\draw [ForestGreen] (5.5,0.5) node {$\bullet$};
\draw [ForestGreen] (5.5,-0.5) node {$\bullet$};
\draw (7,0) node {$\bullet$};
%Séparateur
\draw [-,gray,thick] (7.5,-1) -- (7.5,1);
%Neuvieme Graphe
%Les arêtes
\draw[black, thick, ->-=.5] (8,0) to (9.5,-0.5);
\draw[black, thick, ->-=.5] (9.5,0.5) to (11,0);
\draw[black, thick, ->-=.5] (9.5,-0.5) to (11,0);
\draw[black, thick, ->-=.5] (9.5,-0.5) to [bend left] (9.5,0.5);
\draw[black, thick, ->-=.5] (9.5,0.5) to [bend left] (9.5,-0.5);
%Les sommets ou neurones
\draw (8,0) node {$\bullet$};
\draw [ForestGreen] (9.5,0.5) node {$\bullet$};
\draw [ForestGreen] (9.5,-0.5) node {$\bullet$};
\draw (11,0) node {$\bullet$};
\end{tikzpicture}
\begin{tikzpicture}
\draw [-,gray,thick] (-1,1) -- (12,1);
\end{tikzpicture}
\begin{tikzpicture}
%Dixieme Graphe
%Les arêtes
\draw[black, thick, ->-=.5] (0,0) to (1.5,0.5);
\draw[black, thick, ->-=.5] (0,0) to (1.5,-0.5);
\draw[black, thick, ->-=.5] (1.5,0.5) to (3,0);
\draw[black, thick, ->-=.5] (1.5,-0.5) to (3,0);
\draw[black, thick, ->-=.5] (1.5,-0.5) to [bend left] (1.5,0.5);
\draw[black, thick, ->-=.5] (1.5,0.5) to [bend left] (1.5,-0.5);
%Les sommets ou neurones
\draw (0,0) node {$\bullet$};
\draw [ForestGreen] (1.5,0.5) node {$\bullet$};
\draw [ForestGreen] (1.5,-0.5) node {$\bullet$};
\draw (3,0) node {$\bullet$};
\end{tikzpicture}
\caption{{\footnotesize All admissible and non-equivalent networks with one input node (left), two intermediate nodes of type I (green) and one output node (right).}}
\end{figure}
\subsection{Results summary}
We let $N(m,n_1,n_2,k;x)$ denote the generating function for the number of admissible networks counted by $N(m,n_1,n_2,k)$. Explicitly,
$$
N(m,n_1,n_2,k;x)=\sum_j c_j x^j
$$
where $c_j$ denotes a function of $m,n_1,n_2$, and $k$ that counts the number of admissible networks with exactly $j$ connections. We thus have
$$ N(m,n_1,n_2,k)=\sum_jc_j. $$
For instance, from the case illustrated in Figure 2, we infer that if $m=1,n_1=2,n_2=0$, and $k=1$, then
$$ c_1=0,\quad c_2=0,\quad c_3=1,\quad c_4=5,\quad c_5=3,\quad c_6=1,\quad c_j=0 \quad (j\ge 7).$$
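These coefficients can be checked by brute force: enumerate all subsets of the six possible connections, keep the admissible ones, and identify each configuration with its image under the swap of the two intermediate nodes. A Python sketch (the edge labels are ours):

```python
from itertools import combinations

# the six possible connections for m=1, n1=2, n2=0, k=1:
# 'a': in->u, 'b': in->v, 'c': u->v, 'd': v->u, 'e': u->out, 'f': v->out
EDGES = "abcdef"
SWAP = {"a": "b", "b": "a", "c": "d", "d": "c", "e": "f", "f": "e"}

def admissible(s):
    # conditions 4.1 and 4.2 for the two intermediate nodes u and v
    u_in  = "a" in s or ("b" in s and "d" in s)
    v_in  = "b" in s or ("a" in s and "c" in s)
    u_out = "e" in s or ("c" in s and "f" in s)
    v_out = "f" in s or ("d" in s and "e" in s)
    return u_in and v_in and u_out and v_out

seen, c = set(), {}
for r in range(7):
    for s in combinations(EDGES, r):
        s = frozenset(s)
        if s in seen or not admissible(s):
            continue
        seen.add(frozenset(SWAP[e] for e in s))  # mark the mirror image
        c[len(s)] = c.get(len(s), 0) + 1

print(c)                 # {3: 1, 4: 5, 5: 3, 6: 1}
print(sum(c.values()))   # 10
```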
We need some more notation. We define $N^*(m,n_1,n_2,k)$ as the number of networks satisfying all conditions, except possibly conditions 4.1 and 4.2. We will refer to this quantity as the total number of networks. We also write
$$
N^*(m,n_1,n_2,k;x)=\sum_j c^*_jx^j
$$
where $c_j^*$ counts the number of such graphs having exactly $j$ connections. Note that $c^*_j\geq c_j$ for all $j$. We write $n$ for the total number of intermediate nodes, i.e., $n=n_1+n_2$. For a nonnegative integer $n$, we write $\lambda\vdash n$ if $\lambda$ is an integer partition of $n$ and we write $\lambda=(1^{a_1},2^{a_2},\ldots,n^{a_n})$, where $a_j\ge 0$ stands for the number of occurrences of the integer $j$ in the partition $\lambda$. Moreover, we write $\lambda_1\cup\lambda_2$ for the union of the partitions $\lambda_1$ and $\lambda_2$, so that if $\lambda_1=(1^{a_1},2^{a_2},\ldots,n^{a_n})$ and $\lambda_2=(1^{b_1}, 2^{b_2},\ldots,n^{b_n})$ then $\lambda_1\cup\lambda_2=(1^{a_1+b_1},2^{a_2+b_2},\ldots,n^{a_n+b_n})$.
\begin{theorem} \label{theo1}The generating function for the total number of networks is
$$
N^*(m,n_1,n_2,k;x)=\sum_{\lambda_1\vdash n_1\atop{\lambda_2\vdash n_2}}\frac{\omega(\lambda_1)\omega(\lambda_2)}{n_1!n_2!}H(\lambda_1\cup\lambda_2,m,k;x)
$$
{\it where}
$$
\omega(\lambda)=n!\prod_{1\le j\le n}\frac{1}{j^{a_j}a_j!}
$$
{\it and}
$$
H(\lambda,m,k;x)=\prod_{a_j>0}\left(1+x^j\right)^{a_j(k+m+j-1)+ja_j(a_j-1)}\prod_{j\ge 1\atop{\ell>j}}
\left(1+x^{\lcm(j,\ell)}\right)^{2j\ell a_ja_\ell/\lcm(j,\ell)}.
$$
\end{theorem}
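As a sanity check, the generating function can be evaluated directly. The following Python sketch (ours; all function names are ours) computes $N^*(m,n_1,n_2,k;x)$ as a coefficient list by summing over pairs of partitions, using the exponent $a_j(k+m+j-1)+ja_j(a_j-1)$, which matches the orbit counts derived in Section 2. For $(m,n_1,n_2,k)=(1,2,0,1)$ it recovers $\frac{1}{2}\left((1+x)^6+(1+x^2)^3\right)$, for a total of $36$ networks.

```python
from math import factorial, gcd, prod

def partitions(n, max_part=None):
    """Yield the partitions of n as frequency dictionaries {part: multiplicity}."""
    if max_part is None:
        max_part = n
    if n == 0:
        yield {}
        return
    for p in range(min(n, max_part), 0, -1):
        for rest in partitions(n - p, p):
            lam = dict(rest)
            lam[p] = lam.get(p, 0) + 1
            yield lam

def omega(lam, n):
    # number of permutations of S_n of type lam
    return factorial(n) // prod(j**a * factorial(a) for j, a in lam.items())

def pmul(p, q):
    r = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] += a * b
    return r

def binom_pow(j, e):
    """(1 + x^j)**e as a coefficient list."""
    p, base = [1], [1] + [0] * (j - 1) + [1]
    for _ in range(e):
        p = pmul(p, base)
    return p

def H(lam, m, k):
    items = sorted(lam.items())
    p = [1]
    for j, a in items:
        # exponent a_j(k+m+j-1) + j a_j (a_j - 1)
        p = pmul(p, binom_pow(j, a * (k + m + j - 1) + j * a * (a - 1)))
    for i, (j, a) in enumerate(items):
        for l, b in items[i + 1:]:
            d = j * l // gcd(j, l)                   # lcm(j, l)
            p = pmul(p, binom_pow(d, 2 * j * l * a * b // d))
    return p

def N_star(m, n1, n2, k):
    """Coefficient list of N*(m, n1, n2, k; x)."""
    total = {}
    for l1 in partitions(n1):
        for l2 in partitions(n2):
            w = omega(l1, n1) * omega(l2, n2)
            lam = dict(l1)
            for j, a in l2.items():
                lam[j] = lam.get(j, 0) + a
            for d, c in enumerate(H(lam, m, k)):
                total[d] = total.get(d, 0) + w * c
    f = factorial(n1) * factorial(n2)
    return [total[d] // f for d in sorted(total)]

print(N_star(1, 2, 0, 1))  # [1, 3, 9, 10, 9, 3, 1], total 36
```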
In Theorem \ref{theoXX}, we obtain a recursive formula for $N(m,n_1,n_2,k;x)$ which we implemented in Python to obtain a list of values of $N(m,n_1,n_2,k)$ given in Table 1. This yields in particular the following striking example:
$$
N(5,5,5,5)=108844524790336539487420588884391944954279893619583184.
$$
We also obtain accurate estimates for both $N^*(m,n_1,n_2,k)$ and $N(m,n_1,n_2,k)$.
\begin{theorem} \label{theo3}
\begin{eqnarray*}
N^*(m,n_1,n_2,k)&=&\frac{2^{(n+m+k-1)n}}{n_1!n_2!}\Bigg(1+\left({n_1\choose 2}+{n_2\choose 2}\right)2^{-2n-m-k+3}\\
& &+\left(2{n_1\choose 3}+2{n_2\choose 3}\right)2^{-4n-2m-2k+8}\\
& &+\left({n_1 \choose 2}{n_2\choose 2}+3{n_1\choose 4}+3{n_2\choose 4}\right)2^{-4n-2m-2k+10}\Bigg)\\
& &+O\left(\frac{2^{(n+m+k-1)n}}{n_1!n_2!}n^72^{-6n-3m-3k}\right).
\end{eqnarray*}
\end{theorem}
\begin{theorem}\label{theo4}
\begin{eqnarray*}
N(m,n_1,n_2,k)&=& \frac{1}{n_1!n_2!}2^{(n+m+k-1)n}-\frac{n}{n_1!n_2!}2^{(m+n+k-1)n-n-m+1}\\
& &+\frac{n}{n_1!n_2!}2^{(m+n+k-1)n-n-k+1}+O\left(\frac{2^{(n+m+k-1)n}n^3}{n_1!n_2!}\left(2^{-2n-\min(m,k)}\right)\right).
\end{eqnarray*}
\end{theorem}
The following asymptotic formulas immediately follow from Theorem \ref{theo3} and Theorem \ref{theo4}.
\begin{corollary} \label{coro1}
As $n\to \infty$,
$$
N(m,n_1,n_2,k)\sim N^*(m,n_1,n_2,k)\sim \frac{2^{(m+n+k-1)n}}{n_1!n_2!}
$$
and
$$
N^*(m,n_1,n_2,k)-N(m,n_1,n_2,k)\sim\frac{n\left(2^{(m+n+k-1)n-m-n+1}+2^{(m+n+k-1)n-k-n+1}\right)}{n_1!n_2!}.
$$
\end{corollary}
In Section \ref{numerics}, we also provide an algorithm that can generate, for a given tuple $(m,n_1,n_2,k)$, all connectivity matrices
associated with the admissible networks.
\section{Notation and preliminary results}
\subsection{Notation and the P\'{o}lya enumeration}
Following the standard notation, we let $S_n$ stand for the set of permutations of $1,2,\ldots, n$. As in \cite{Andrews1988}, we write $\lambda\vdash n$ if $\lambda$ is an integer partition of $n$. We denote partitions by the Greek letters $\lambda$ or $\mu$ and use the frequency representation $\lambda=(1^{a_1},2^{a_2},\ldots)$ to mean that the integer $j$ appears $a_j$ times in the partition $\lambda$.
For a given permutation $s\in S_n$
and a partition $\lambda\vdash n$,
we say that $s$ is of type $\lambda$
($\tipe(s)=\lambda=(1^{a_1},2^{a_2},\ldots)$) if $s$ has precisely $a_j$ disjoint cycles of length $j$. For example, if $n=4$ and $s=(2,1,3,4)$ then $s$ can be written as the product of disjoint cycles $(1,2)(3)(4)$, so $\tipe(s)=(1^2,2^1)$. For a collection of partitions $\lambda_j$, we write $a^{j}$ for the frequency vector associated with the partition $\lambda_j$. We also let $\phi$ stand for the empty partition.
The P\'{o}lya enumeration theorem gives the generating function associated with the counting of graphs satisfying certain conditions by considering the orbits induced on the connections of the graph by the permutations of its nodes. If $H(x)$ is such a generating function (where the coefficient of $x^n$ stands for the number of such graphs with $n$ connections) and if ${\cal G}$ stands for the group of permitted node permutations, we have
\begin{equation}
H(x)=\frac{1}{\#{\cal G}}\sum_{s\in{\cal G}}\prod_{j=1}^{\infty}(1+x^j)^{\alpha (j,s)}
\end{equation}
where $\alpha(j,s)$ is the number of disjoint orbits of length $j$ induced on the connections of the graphs by the node permutation $s$. The expression $$\prod_{j=1}^{\infty}(1+x^j)^{\alpha (j,s)}$$ can be thought of as the generating function of {\it labelled} graphs left invariant by the permutation $s$. For example, if we consider the networks with 1 input node, 2 intermediate nodes of type I and 1 output node, the only admissible permutations are those of the two indistinguishable intermediate nodes. The admissible permutations are thus $s_1=(1,2)$ with $\tipe(s_1)=(1^2)$ and $s_2=(2,1)$ with $\tipe(s_2)=(2^1)$. There are six possible connections, represented in the figure below. Given that $s_1$ leaves all six connections fixed (induces six disjoint orbits of length one), the term of the generating function associated with $s_1$ is $(1+x)^6$. The permutation $s_2$ induces 3 orbits of length 2 on the connections of the network, so the term of the generating function associated with $s_2$ is $(1+x^2)^3$. This yields
$$
N^*(1,2,0,1;x)=\frac{1}{2}\left((1+x)^6+(1+x^2)^3\right).
$$
\begin{center}
\begin{tikzpicture}
%Les sommets ou neurones
\draw (0,0) node {$\bullet$};
\draw (3,1) node {$\bullet$};
\draw (3,-1) node {$\bullet$};
\draw (6,0) node {$\bullet$};
%Nomination des neurones
\draw (0,0) node [left] {$m$};
\draw (3,1) node [above] {$n_{1,1}$};
\draw (3,-1) node [below] {$n_{1,2}$};
\draw (6,0) node [right] {$k$};
%Les arêtes
\draw[black, thick, ->-=.5] (0,0) to (3,1);
\draw[black, thick, ->-=.5] (0,0) to (3,-1);
\draw[black, thick, ->-=.5] (3,1) to (6,0);
\draw[black, thick, ->-=.5] (3,-1) to (6,0);
\draw[black, thick, ->-=.5] (3,-1) to [bend right] (3,1);
\draw[black, thick, ->-=.5] (3,1) to [bend right] (3,-1);
%Nomination des arêtes
\draw (1.5,0.5) node [above left, blue] {$a$};
\draw (1.5,-0.5) node [below left, blue] {$b$};
\draw (2.7,0) node [left, blue] {$c$};
\draw (3.3,0) node [right, blue] {$d$};
\draw (4.5,0.5) node [above right, blue] {$e$};
\draw (4.5,-0.5) node [below right, blue] {$f$};
\end{tikzpicture}\\
Orbits induced by $s_2$
\end{center}
\begin{align*}
\textcolor{blue}{a}\longrightarrow \textcolor{blue}{b}\longrightarrow \textcolor{blue}{a},
\quad \textcolor{blue}{c}\longrightarrow \textcolor{blue}{d}\longrightarrow \textcolor{blue}{c},
\quad \textcolor{blue}{e}\longrightarrow \textcolor{blue}{f}\longrightarrow \textcolor{blue}{e}.
\end{align*}
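The orbit lengths $\alpha(j,s)$ can be computed mechanically: apply the node permutation to every ordered pair of connectable nodes and read off the cycle lengths of the induced permutation on connections. A Python sketch (ours) for the six connections above:

```python
def edge_orbits(node_perm, edges):
    """Cycle lengths of the permutation induced on directed edges
    by a node permutation given as a dict (fixed nodes omitted)."""
    image = {(u, v): (node_perm.get(u, u), node_perm.get(v, v))
             for (u, v) in edges}
    lengths, seen = [], set()
    for e in edges:
        if e in seen:
            continue
        length, cur = 0, e
        while cur not in seen:       # walk the orbit of e
            seen.add(cur)
            cur = image[cur]
            length += 1
        lengths.append(length)
    return sorted(lengths)

# m=1, n1=2, k=1: input 'in', intermediates 'u' and 'v', output 'out'
edges = [("in", "u"), ("in", "v"), ("u", "v"),
         ("v", "u"), ("u", "out"), ("v", "out")]
print(edge_orbits({}, edges))                     # s1 = identity: [1, 1, 1, 1, 1, 1]
print(edge_orbits({"u": "v", "v": "u"}, edges))   # s2 = swap:     [2, 2, 2]
```

The two outputs correspond exactly to the terms $(1+x)^6$ and $(1+x^2)^3$ above.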
In many cases, as in the present paper, the values of $\alpha(j,s)$ depend only on the permutation type $\lambda=\tipe(s)$ so we can write
\begin{equation}
H(x)=\frac{1}{\#{\cal G}}\sum_{\lambda\vdash n}\#\{s\in {\cal G}:\tipe(s)=\lambda\}\prod_{j=1}^{\infty}(1+x^j)^{\alpha (j,\lambda)}.
\end{equation}
In order to compute $\#\{s\in {\cal G}:\tipe(s)=\lambda\}$, we will make use of the following result:
\begin{lemma}
For a positive integer $n$ and a partition type $\lambda\vdash n$, if we define
$$
\omega(\lambda):=\#\{s\in S_n: \tipe(s)=\lambda\}
$$
then
$$
\omega(\lambda)=n!\prod_{1\le j\le n}\frac{1}{j^{a_j}a_j!}.
$$
\end{lemma}
For example, if $n=4$, the number of permutations of the type
$(1^2,2^1)$ is
$$
\frac{4!}{1^2 2! 2^1 1!}=6.
$$
\begin{proof}
An elementary proof of this classical result can be found in \cite{Harary1973}.
\end{proof}
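In practice, $\omega(\lambda)$ is straightforward to compute from the frequency representation; a short Python sketch (ours), with partitions represented as frequency dictionaries:

```python
from math import factorial

def omega(lam):
    """Number of permutations in S_n of type lam = {j: a_j}."""
    n = sum(j * a for j, a in lam.items())
    denom = 1
    for j, a in lam.items():
        denom *= j**a * factorial(a)   # j^{a_j} a_j!
    return factorial(n) // denom

print(omega({1: 2, 2: 1}))  # 6, as in the example above
```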
For two permutations $s_1$ and $s_2$ acting on disjoint sets of nodes, we let $s_1\cdot s_2$ denote the usual permutation product. Provided that
$\tipe(s_1)=\lambda_1$ and that $\tipe(s_2)=\lambda_2$, we also define
\begin{equation}
\lambda_1\cup \lambda_2=\tipe(s_1\cdot s_2).
\end{equation}
This is well defined since $\lambda_1\cup \lambda_2$ does not depend on the particular choices of $s_1$ and $s_2$. For example, if $\lambda_1=(1^1,2^1,3^2)$ and $\lambda_2=(1^1,4^1)$, we have $\lambda_1\cup \lambda_2=(1^2,2,3^2,4)$. For an integer $N\ge 2$, we generalize this notation to
$$
\lambda_1\cup \lambda_2\cup\cdots \cup \lambda_{N+1}=\left(\lambda_1\cup \lambda_2\cup\cdots \cup \lambda_{N}\right)\cup \lambda_{N+1}.
$$
For two partitions $\lambda_1, \lambda_2$ and any permutation $s$ such that $\tipe(s)=\lambda_1\cup \lambda_2$ we define
$$
\Omega(\lambda_1,\lambda_2):=\#\{(s_1,s_2):\tipe(s_1)=\lambda_1,\tipe(s_2)=\lambda_2, s_1\cdot s_2=s\}.
$$
For example, suppose that $\lambda_1=(1,2)$ and $\lambda_2=(1)$ so that $\lambda=(1^2,2)$. A canonical choice of $s$ (written as a product of disjoint cycles) is $s=(1)(2)(3,4)$. The possible choices for the pair $(s_1,s_2)$ are $s_1=(1)(3 4)$ and $s_2=(2)$ as well as $s_1=(2)(3 4)$ and $s_2=(1)$. We thus have $\Omega(\lambda_1,\lambda_2)=2$. For partitions $\lambda_1,\lambda_2,\ldots, \lambda_N$ we recursively extend the definition of $\Omega$ to
\begin{eqnarray*}
& &\Omega(\lambda_1,\lambda_2,\ldots,\lambda_N)\\
&=&\Omega(\lambda_1\cup\cdots\cup \lambda_{N-1}, \lambda_N)\,\Omega(\lambda_1,\lambda_2,\ldots, \lambda_{N-1}).
\end{eqnarray*}
We can obtain the following explicit expression for $\Omega(\lambda_1,\ldots, \lambda_N)$.
\begin{lemma}\label{lem2}
$$
\Omega(\lambda_1,\lambda_2,\ldots,\lambda_N)=\prod_{j}\left(a_j!\prod_{y=1}^N\frac{1}{a^{y}_j!}\right)
$$
where $a$ is the frequency vector of $\lambda_1\cup\lambda_2\cup\cdots\cup\lambda_N$ and the $a^{y}$s are the frequency vectors of the $\lambda_y$.
\end{lemma}
\begin{proof}
Lemma \ref{lem2} follows directly from the observation that the $a_j$ cycles of length $j$ have to be distributed among the $N$ permutations in such a way that the $y$th permutation receives exactly $a^{y}_j$ of these cycles. The number of ways to achieve this is exactly
$$
a_j!\prod_{y=1}^N\frac{1}{a^{y}_j!}.
$$
\end{proof}
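Lemma \ref{lem2} translates directly into code; a Python sketch (ours), again with partitions as frequency dictionaries:

```python
from math import factorial
from collections import Counter

def Omega(*lams):
    """Number of ways to split a permutation of type
    lam_1 U ... U lam_N into permutations of types lam_1, ..., lam_N."""
    union = Counter()
    for lam in lams:
        union.update(lam)
    result = 1
    for j, a in union.items():
        result *= factorial(a)              # a_j!
        for lam in lams:
            result //= factorial(lam.get(j, 0))   # / a_j^{(y)}!
    return result

print(Omega({1: 1, 2: 1}, {1: 1}))  # 2, as in the example above
```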
\begin{remark} It is implicit in the definition of $\Omega$ that the value of $\Omega(\lambda_1,\lambda_2)$ does not depend on the particular choice of the permutation $s$ of type $\lambda_1\cup\lambda_2$; this indeed follows from Lemma \ref{lem2} and its proof.
\end{remark}
\subsection{Auxiliary functions}
We derive the components of the generating function $N(m,n_1,n_2,k;x)$
associated with the number of networks counted by $N(m,n_1,n_2,k)$. We let $F(\lambda;x)$ denote the generating function associated with the action of a permutation $s$ of type $\lambda$ on connections between nodes permuted by $s$. In other words, for a permutation $s$ of type $\lambda$, $F(\lambda;x)$ is the generating function of {\it labelled} graphs left invariant by this permutation. We have
\begin{lemma}
\begin{eqnarray*}
F(\lambda;x)&=&\prod_{a_j>0}\left(1+x^j\right)^{a_j(j-1)+ja_j(a_j-1)}\prod_{j\ge 1}\prod_{y>j}\left(1+x^{\lcm(j,y)}\right)^{2jya_ja_y/\lcm(j,y)}.
\end{eqnarray*}
\end{lemma}
\begin{proof}
The expression
$$
\prod_{a_j>0}\left(1+x^j\right)^{a_j (j-1)}
$$
corresponds to the action of a permutation of type $\lambda$ on edges between two nodes in the same cycle. Indeed, since we are considering oriented connections, the length of the orbit induced on such connections by a node cycle of length $j$ is also $j$. Given that there are $j(j-1)$ possible connections between nodes permuted by the same cycle, the number of orbits is $j(j-1)/j=j-1$. Since there are $a_j$ cycles of length $j$, the exponent must be multiplied by $a_j$.
On the other hand, the expression
$$
\prod_{a_j>0}\left(1+x^j\right)^{ja_j(a_j-1)}
$$
corresponds to the action of a permutation of type $\lambda$ on connections between nodes permuted by different cycles of the same length. Indeed, the length of the orbits induced on the connections is the same as the length of the node cycle: $j$. For two disjoint cycles of length $j$, there are $2 j^2$ possible connections between a node of the first cycle and one of the second. The number of disjoint connection orbits is thus $2 j^2/j=2j$. Furthermore the number of ways of choosing two node cycles of length $j$ is ${a_j\choose 2}$ so the exponent is equal to
$$
\frac{a_j(a_j-1)}{2}2j=ja_j(a_j-1).
$$
Finally, the expression
$$
\prod_{j\ge 1}\prod_{y>j}\left(1+x^{\lcm(j,y)}\right)^{2jya_ja_y/\lcm(j,y)}
$$
corresponds to the action of a permutation of type $\lambda$ on connections between two nodes belonging to different cycles of different lengths. Indeed, on connections between a node belonging to a cycle of length $j$ and a node belonging to a cycle of length $y$, the node permutation will induce orbits of length
$$
\lcm(j,y).
$$
Given that the number of such possible connections is $2jy$, the number of such orbits is equal to
$$
2jy/\lcm(j,y).
$$
Moreover, the number of ways of choosing a node cycle of length $j$ and a node cycle of length $y$ is $a_ja_y$ so the exponent is
$$
2jy a_ja_y/\lcm(j,y).
$$
\end{proof}
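The product formula of the lemma is straightforward to evaluate numerically. The following Python sketch (our own encoding, not the authors' implementation: a type $\lambda$ is a dictionary mapping each cycle length $j$ to its multiplicity $a_j$) evaluates $F(\lambda;x)$ at a numeric $x$:

```python
from math import gcd

def F(lam, x):
    # lam encodes a type lambda as {cycle length j: multiplicity a_j}
    lcm = lambda a, b: a * b // gcd(a, b)
    val = 1
    for j in lam:
        a = lam[j]
        # orbits of connections within one cycle and between two cycles of equal length
        val *= (1 + x ** j) ** (a * (j - 1) + j * a * (a - 1))
    lens = sorted(lam)
    for i, j in enumerate(lens):
        for y in lens[i + 1:]:
            # orbits of connections between cycles of different lengths j < y
            val *= (1 + x ** lcm(j, y)) ** (2 * j * y * lam[j] * lam[y] // lcm(j, y))
    return val

print(F({1: 3}, 1))        # 64 = 2^(3*2): the identity fixes every labelled graph
print(F({1: 1, 2: 1}, 1))  # 8 = (1+x^2)^3 at x = 1
```

Setting $x=1$ counts all labelled graphs left invariant by a permutation of the given type, which is the specialization used repeatedly below.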
If $s_1$ and $s_2$ are permutations respectively of type $\lambda_1$ and $\lambda_2$ acting on disjoint sets of nodes, we define $D(\lambda_1|\lambda_2;x)$ as the generating function counting the number of {\it labelled} graphs left invariant by $s_1\cdot s_2$ under the following restriction: Only connections from nodes permuted by $s_1$ to nodes permuted by $s_2$ are permitted. Observe that by symmetry, it follows from this definition that $D(\lambda_1|\lambda_2;x)=D(\lambda_2|\lambda_1;x)$. We will use the following result.
\begin{lemma}\label{lem4}
\begin{equation*}
D(\lambda_1|\lambda_2;x)=\prod_{j\ge 1}\prod_{y\ge 1}\left(1+x^{\lcm(j,y)}\right)^{jya_j^{1}a_y^{2}/\lcm(j,y)}.
\end{equation*}
\end{lemma}
\begin{proof}
Cycles of length $j$ and $y$ will induce orbits of length $\lcm(j,y)$ on connections from nodes permuted by the cycle of length $j$ to nodes permuted by the cycle of length $y$. Given that we only allow connections in one direction, the total number of possible connections is $jy$. It follows that the number of disjoint orbits is $jy/\lcm(j,y)$. Furthermore, the number of ways of choosing a node cycle of length $j$ in a permutation of type $\lambda_1$ and a node cycle of length $y$ in a permutation of type $\lambda_2$ is $a_j^{1}a_y^{2}$ from which Lemma \ref{lem4} follows.
\end{proof}
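Lemma \ref{lem4} translates into a short routine as well; as before, the dictionary encoding of types is our own convention:

```python
from math import gcd

def D(lam1, lam2, x):
    # lam1, lam2: cycle types as {cycle length: multiplicity}, on disjoint node sets
    lcm = lambda a, b: a * b // gcd(a, b)
    val = 1
    for j, a1 in lam1.items():
        for y, a2 in lam2.items():
            # the j*y*a1*a2 one-way connections fall into orbits of length lcm(j, y)
            val *= (1 + x ** lcm(j, y)) ** (j * y * a1 * a2 // lcm(j, y))
    return val

print(D({2: 1}, {1: 1}, 1))  # 2 = (1+x^2) at x = 1
```

The symmetry $D(\lambda_1|\lambda_2;x)=D(\lambda_2|\lambda_1;x)$ noted above is visible in the code: swapping the two arguments only reorders the factors.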
If $s_1$ and $s_2$ (respectively of type $\lambda_1$ and $\lambda_2$) are two permutations acting on disjoint sets of nodes, we define $I(\lambda_1|\lambda_2;x)$ as the generating function of the number of {\it labelled} networks left invariant by $s_1\cdot s_2$ under the following restrictions: 1. Only connections from nodes permuted by $s_1$ to nodes permuted by $s_2$ are permitted. 2. Each node permuted by $s_1$ must send at least one connection to a node permuted by $s_2$. We have the following explicit expression for $I(\lambda_1|\lambda_2;x)$.
\begin{lemma}\label{lem5}
\begin{equation*}
I(\lambda_1|\lambda_2;x)=\prod_{a_j^{1}>0}\left(\left(\prod_{a_y^{2}>0}\left(1+x^{\lcm(j,y)}\right)^{jya_y^{2}/\lcm(j,y)}\right)-1\right)^{a_j^{1}}.
\end{equation*}
\end{lemma}
\begin{proof}
Let $c_j$ be a node cycle of length $j$ of $s_1$ and let $s_2$ be a permutation of type $\lambda_2$ acting on a disjoint set of nodes. The action of $s_2\cdot c_j $ on connections to nodes permuted by $s_2$ from nodes permuted by $c_j$ is given by
\begin{equation}\label{eq:I}
\prod_{a_y^2>0}\left(1+x^{\lcm(j,y)}\right)^{jya_y^{2}/\lcm(j,y)}.
\end{equation}
Since we impose that at least one connection actually exists, we must remove from ($\ref{eq:I}$) the instance of no connection, which corresponds to the term 1 in the generating function. We thus obtain $I(\lambda_1|\lambda_2;x)$ by performing the product
$$
\prod_{a_y^{2}>0}\left(1+x^{\lcm(j,y)}\right)^{jya_y^{2}/\lcm(j,y)}-1
$$
over all cycles of a permutation $s_1$ of type $\lambda_1$ which proves Lemma \ref{lem5}.
\end{proof}
Observe that by symmetry $I(\lambda_1|\lambda_2;x)$ is also the generating function associated with the number of networks such that: 1. Only connections to nodes permuted by $s_1$ from nodes permuted by $s_2$ are permitted. 2. Each node permuted by $s_1$ must receive at least one connection from a node permuted by $s_2$.
For illustration, if $\lambda_1=(2^1)$ and $\lambda_2=(2^1)$, then
$$
I(\lambda_1|\lambda_2;x)=\left(1+x^2\right)^2-1=2x^2+x^4
$$
and the corresponding networks are:
\begin{center}
\begin{tikzpicture}
% Vertices (neurons)
\draw (0,0) node {$\bullet$};
\draw (1,0) node {$\bullet$};
\draw (0,1) node {$\bullet$};
\draw (1,1) node {$\bullet$};
% Edges
\draw[black, thick, ->-=.5] (0,0) to (0,1);
\draw[black, thick, ->-=.5] (1,0) to (1,1);
\draw[-, black, thick, ] (1.5,-0.5) to (1.5,1.5);
% Vertices (neurons)
\draw (2,0) node {$\bullet$};
\draw (3,0) node {$\bullet$};
\draw (2,1) node {$\bullet$};
\draw (3,1) node {$\bullet$};
% Edges
\draw[black, thick, ->-=.5] (2,0) to [bend right](3,1);
\draw[black, thick, ->-=.5] (3,0) to[bend left] (2,1);
\draw[-, black, thick, ] (3.5,-0.5) to (3.5,1.5);
% Vertices (neurons)
\draw (4,0) node {$\bullet$};
\draw (5,0) node {$\bullet$};
\draw (4,1) node {$\bullet$};
\draw (5,1) node {$\bullet$};
\draw[black, thick, ->-=.5] (4,0) to (4,1);
\draw[black, thick, ->-=.5] (5,0) to (5,1);
\draw[black, thick, ->-=.5] (4,0) to [bend right](5,1);
\draw[black, thick, ->-=.5] (5,0) to[bend left] (4,1);
\end{tikzpicture}\\
\end{center}
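The explicit product of Lemma \ref{lem5} is equally easy to evaluate. The Python sketch below (same dictionary encoding of types as before, our own convention) reproduces the example: $I((2^1)|(2^1);1)=3$, matching the three networks drawn above.

```python
from math import gcd

def I(lam1, lam2, x):
    # every cycle of lam1 must send at least one connection to the nodes of lam2
    lcm = lambda a, b: a * b // gcd(a, b)
    val = 1
    for j, a1 in lam1.items():
        inner = 1
        for y, a2 in lam2.items():
            inner *= (1 + x ** lcm(j, y)) ** (j * y * a2 // lcm(j, y))
        # subtract the empty-connection instance for each cycle of lam1
        val *= (inner - 1) ** a1
    return val

print(I({2: 1}, {2: 1}, 1))  # 3 = 2x^2 + x^4 evaluated at x = 1
```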
For a permutation $s_1$ of type $\lambda_1$ and a permutation $s_2$ of type $\lambda_2$ acting on disjoint sets of nodes, we define $T(\lambda_1|\lambda_2;x)$ as the generating function counting the number of {\it labelled} networks left invariant by $s_1\cdot s_2$ and satisfying the following properties: 1. The only permitted connections are those between two nodes permuted by $s_1$ or from a node permuted by $s_1$ to a node permuted by $s_2$. 2. Every node permuted by $s_1$ sends a path to at least one node permuted by $s_2$. We obtain the following recursive relation for $T(\lambda_1|\lambda_2;x)$.
\begin{lemma}\label{lem6}
The following relation holds:
\begin{equation}
T(\lambda_1|\lambda_2;x)=I(\lambda_1|\lambda_2;x)F(\lambda_1;x)+\sum_{\mu_1\cup \mu_2=\lambda_1\atop{\mu_1\not=\phi,\mu_2\not=\phi}}\Omega(\mu_1,\mu_2)I(\mu_1|\lambda_2;x)F(\mu_1;x)D(\mu_1|\mu_2;x)T(\mu_2|\mu_1;x),
\nonumber
\end{equation}
which allows us to compute $T(\lambda_1|\lambda_2;x)$ recursively.
\end{lemma}
\begin{proof}
The term $I(\lambda_1|\lambda_2;x)F(\lambda_1;x)$ corresponds to the situation in which all nodes permuted by $s_1$ (of type $\lambda_1$) send a direct connection to at least one node permuted by $s_2$ (of type $\lambda_2$). Otherwise, we write $s_1=\ell_1\cdot\ell_2$ with $\tipe(\ell_1)=\mu_1$ and $\tipe(\ell_2)=\mu_2$, where $\ell_1$ permutes the nodes sending a {\it direct} connection to at least one node permuted by $s_2$. The factor $T(\mu_2|\mu_1;x)$ then imposes that every node permuted by $\ell_2$ sends a path to a node permuted by $s_2$ through a node permuted by $\ell_1$. The factor $D(\mu_1|\mu_2;x)$ accounts for the possible connections from nodes permuted by $\ell_1$ to the nodes permuted by $\ell_2$.
\end{proof}
For illustration, we assume that $\lambda_1=(1^1,2^1)$ and that $\lambda_2=(1^1)$. We then have
\begin{eqnarray*}
T(\lambda_1|\lambda_2;x)&=&I\left((1^1,2^1)|(1^1);x\right)F((1^1,2^1);x)\\
& &+I\left((2^1)|(1^1);x\right)F\left((2^1);x\right)D\left((2^1)|(1^1);x\right)T\left((1^1)|(2^1);x\right)\\
& &+I\left((1^1)|(1^1);x\right)F\left((1^1);x\right)D\left((1^1)|(2^1);x\right)T\left((2^1)|(1^1);x\right).
\end{eqnarray*}
Using $T\left((1^1)|(2^1);x\right)=I\left((1^1)|(2^1);x\right)F\left((1^1);x\right)$ and $T\left((2^1)|(1^1);x\right)=I\left((2^1)|(1^1);x\right)F\left((2^1);x\right)$ we obtain
\begin{eqnarray*}
T(\lambda_1|\lambda_2;x)&=&x^3\left(1+x^2\right)^3\\
& &+x^2\left(1+x^2\right)(1+x^2)x^2+x(1+x^2)x^2\left(1+x^2\right)\\
&=&x^9+x^8+4x^7+2x^6+5x^5+x^4+2x^3.
\end{eqnarray*}
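The recursion of Lemma \ref{lem6} is also easy to check numerically. In the Python sketch below (our own encoding: types are dictionaries $\{j:a_j\}$, and $\Omega(\mu_1,\mu_2)$ is taken to be the number of ways of splitting the cycles between the two parts, a product of binomial coefficients $\prod_j{a_j\choose a_j^{\mu_1}}$), the value $T((1^1,2^1)|(1^1);1)=16$ agrees with the sum of the coefficients of the polynomial just computed:

```python
from itertools import product
from math import comb, gcd

def lcm(a, b):
    return a * b // gcd(a, b)

def F(lam, x):
    lens = sorted(lam)
    val = 1
    for j in lens:
        a = lam[j]
        val *= (1 + x ** j) ** (a * (j - 1) + j * a * (a - 1))
    for i, j in enumerate(lens):
        for y in lens[i + 1:]:
            val *= (1 + x ** lcm(j, y)) ** (2 * j * y * lam[j] * lam[y] // lcm(j, y))
    return val

def D(l1, l2, x):
    val = 1
    for j, a1 in l1.items():
        for y, a2 in l2.items():
            val *= (1 + x ** lcm(j, y)) ** (j * y * a1 * a2 // lcm(j, y))
    return val

def I(l1, l2, x):
    val = 1
    for j, a1 in l1.items():
        inner = 1
        for y, a2 in l2.items():
            inner *= (1 + x ** lcm(j, y)) ** (j * y * a2 // lcm(j, y))
        val *= (inner - 1) ** a1
    return val

def splits(lam):
    # nontrivial decompositions mu1 u mu2 = lam with multiplicity Omega(mu1, mu2)
    items = sorted(lam.items())
    for pick in product(*[range(a + 1) for _, a in items]):
        mu1 = {j: c for (j, a), c in zip(items, pick) if c > 0}
        mu2 = {j: a - c for (j, a), c in zip(items, pick) if a - c > 0}
        if mu1 and mu2:
            om = 1
            for (j, a), c in zip(items, pick):
                om *= comb(a, c)
            yield mu1, mu2, om

def T(lam1, lam2, x):
    # Lemma: direct senders, plus senders routing through an already-connected part
    val = I(lam1, lam2, x) * F(lam1, x)
    for mu1, mu2, om in splits(lam1):
        val += om * I(mu1, lam2, x) * F(mu1, x) * D(mu1, mu2, x) * T(mu2, mu1, x)
    return val

print(T({1: 1, 2: 1}, {1: 1}, 1))  # 16
```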
We draw the corresponding networks under the convention that the node on which $s_2$ acts is in black while the nodes on the 2-cycle of $s_1$ are in green and the node left fixed by $s_1$ is in blue.
\begin{center}
\begin{tikzpicture}
% Vertices (neurons)
\draw (1,0) node {$\bullet$};
\draw [ForestGreen](0,1) node {$\bullet$};
\draw [ForestGreen](1,1) node {$\bullet$};
\draw [Blue](2,1) node {$\bullet$};
% Edges
\draw[black, thick, ->-=.5] (0,1) to (1,0);
\draw[black, thick, ->-=.5] (1,1) to (1,0);
\draw[black, thick, ->-=.5] (2,1) to (1,0);
% Vertices (neurons)
\draw (4,0) node {$\bullet$};
\draw [ForestGreen](3,1) node {$\bullet$};
\draw [ForestGreen](4,1) node {$\bullet$};
\draw [Blue](5,1) node {$\bullet$};
\draw[black, thick, ->-=.5] (3,1) to (4,0);
\draw[black, thick, ->-=.5] (4,1) to (4,0);
\draw[black, thick, ->-=.5] (5,1) to (4,0);
\draw[black, thick, ->-=.5] (3,1) to [bend right](4,1);
\draw[black, thick, ->-=.5] (4,1) to [bend right](3,1);
% Vertices (neurons)
\draw (7,0) node {$\bullet$};
\draw [ForestGreen](6,1) node {$\bullet$};
\draw [ForestGreen](7,1) node {$\bullet$};
\draw [Blue](8,1) node {$\bullet$};
\draw[black, thick, ->-=.5] (6,1) to (7,0);
\draw[black, thick, ->-=.5] (7,1) to (7,0);
\draw[black, thick, ->-=.5] (8,1) to (7,0);
\draw[black, thick, ->-=.5] (8,1) to [bend left](7,1);
\draw[black, thick, ->-=.5] (8,1) to [bend right](6,1);
% Vertices (neurons)
\draw (10,0) node {$\bullet$};
\draw [ForestGreen](9,1) node {$\bullet$};
\draw [ForestGreen](10,1) node {$\bullet$};
\draw [Blue](11,1) node {$\bullet$};
\draw[black, thick, ->-=.5] (9,1) to (10,0);
\draw[black, thick, ->-=.5] (10,1) to (10,0);
\draw[black, thick, ->-=.5] (11,1) to (10,0);
\draw[black, thick, ->-=.5] (9,1) to [bend left](11,1);
\draw[black, thick, ->-=.5] (10,1) to [bend right](11,1);
% Vertices (neurons)
\draw (1,-2) node {$\bullet$};
\draw [ForestGreen](0,-1) node {$\bullet$};
\draw [ForestGreen](1,-1) node {$\bullet$};
\draw [Blue](2,-1) node {$\bullet$};
% Edges
\draw[black, thick, ->-=.5] (0,-1) to (1,-2);
\draw[black, thick, ->-=.5] (1,-1) to (1,-2);
\draw[black, thick, ->-=.5] (2,-1) to (1,-2);
\draw[black, thick, ->-=.5] (0,-1) to [bend left](2,-1);
\draw[black, thick, ->-=.5] (1,-1) to [bend right](2,-1);
\draw[black, thick, ->-=.5] (0,-1) to [bend left](1,-1);
\draw[black, thick, ->-=.5] (1,-1) to [bend left](0,-1);
% Vertices (neurons)
\draw (4,-2) node {$\bullet$};
\draw [ForestGreen](3,-1) node {$\bullet$};
\draw [ForestGreen](4,-1) node {$\bullet$};
\draw [Blue](5,-1) node {$\bullet$};
% Edges
\draw[black, thick, ->-=.5] (3,-1) to (4,-2);
\draw[black, thick, ->-=.5] (4,-1) to (4,-2);
\draw[black, thick, ->-=.5] (5,-1) to (4,-2);
\draw[black, thick, ->-=.5] (5,-1) to [bend right](3,-1);
\draw[black, thick, ->-=.5] (5,-1) to [bend left](4,-1);
\draw[black, thick, ->-=.5] (3,-1) to [bend left](4,-1);
\draw[black, thick, ->-=.5] (4,-1) to [bend left](3,-1);
% Vertices (neurons)
\draw (7,-2) node {$\bullet$};
\draw [ForestGreen](6,-1) node {$\bullet$};
\draw [ForestGreen](7,-1) node {$\bullet$};
\draw [Blue](8,-1) node {$\bullet$};
% Edges
\draw[black, thick, ->-=.5] (6,-1) to (7,-2);
\draw[black, thick, ->-=.5] (7,-1) to (7,-2);
\draw[black, thick, ->-=.5] (8,-1) to (7,-2);
\draw[black, thick, ->-=.5] (8,-1) to [bend right](6,-1);
\draw[black, thick, ->-=.5] (8,-1) to [bend left](7,-1);
\draw[black, thick, ->-=.5] (6,-1) to [bend right](8,-1);
\draw[black, thick, ->-=.5] (7,-1) to [bend left](8,-1);
% Vertices (neurons)
\draw (10,-2) node {$\bullet$};
\draw [ForestGreen](9,-1) node {$\bullet$};
\draw [ForestGreen](10,-1) node {$\bullet$};
\draw [Blue](11,-1) node {$\bullet$};
% Edges
\draw[black, thick, ->-=.5] (9,-1) to (10,-2);
\draw[black, thick, ->-=.5] (10,-1) to (10,-2);
\draw[black, thick, ->-=.5] (11,-1) to (10,-2);
\draw[black, thick, ->-=.5] (11,-1) to [bend right](9,-1);
\draw[black, thick, ->-=.5] (11,-1) to [bend left](10,-1);
\draw[black, thick, ->-=.5] (9,-1) to [bend right](11,-1);
\draw[black, thick, ->-=.5] (10,-1) to [bend left](11,-1);
\draw[black, thick, ->-=.5] (9,-1) to [bend right](10,-1);
\draw[black, thick, ->-=.5] (10,-1) to [bend right](9,-1);
% Vertices (neurons)
\draw (1,-4) node {$\bullet$};
\draw [ForestGreen](0.5,-3.5) node {$\bullet$};
\draw [ForestGreen](1.5,-3.5) node {$\bullet$};
\draw [Blue](1,-3) node {$\bullet$};
\draw[black, thick, ->-=.5] (1,-3) to (0.5,-3.5);
\draw[black, thick, ->-=.5] (1,-3) to (1.5,-3.5);
\draw[black, thick, ->-=.5] (0.5,-3.5) to (1,-4);
\draw[black, thick, ->-=.5] (1.5,-3.5) to (1,-4);
\draw (4,-4) node {$\bullet$};
\draw [ForestGreen](3.5,-3.5) node {$\bullet$};
\draw [ForestGreen](4.5,-3.5) node {$\bullet$};
\draw [Blue](4,-3) node {$\bullet$};
\draw[black, thick, ->-=.5] (4,-3) to (3.5,-3.5);
\draw[black, thick, ->-=.5] (4,-3) to (4.5,-3.5);
\draw[black, thick, ->-=.5] (3.5,-3.5) to (4,-4);
\draw[black, thick, ->-=.5] (4.5,-3.5) to (4,-4);
\draw[black, thick, ->-=.5] (3.5,-3.5) to [bend left](4.5,-3.5);
\draw[black, thick, ->-=.5] (4.5,-3.5) to [bend left](3.5,-3.5);
% Vertices (neurons)
\draw (7,-4) node {$\bullet$};
\draw [ForestGreen](6.5,-3.5) node {$\bullet$};
\draw [ForestGreen](7.5,-3.5) node {$\bullet$};
\draw [Blue](7,-3) node {$\bullet$};
\draw[black, thick, ->-=.5] (7,-3) to [bend left](6.5,-3.5);
\draw[black, thick, ->-=.5] (7,-3) to [bend right](7.5,-3.5);
\draw[black, thick, ->-=.5] (6.5,-3.5) to (7,-4);
\draw[black, thick, ->-=.5] (7.5,-3.5) to (7,-4);
\draw[black, thick, ->-=.5] (6.5,-3.5) to [bend left](7,-3);
\draw[black, thick, ->-=.5] (7.5,-3.5) to [bend right](7,-3);
% Vertices (neurons)
\draw (10,-4) node {$\bullet$};
\draw [ForestGreen](9.5,-3.5) node {$\bullet$};
\draw [ForestGreen](10.5,-3.5) node {$\bullet$};
\draw [Blue](10,-3) node {$\bullet$};
\draw[black, thick, ->-=.5] (10,-3) to [bend left](9.5,-3.5);
\draw[black, thick, ->-=.5] (10,-3) to [bend right](10.5,-3.5);
\draw[black, thick, ->-=.5] (9.5,-3.5) to (10,-4);
\draw[black, thick, ->-=.5] (10.5,-3.5) to (10,-4);
\draw[black, thick, ->-=.5] (9.5,-3.5) to [bend left](10,-3);
\draw[black, thick, ->-=.5] (10.5,-3.5) to [bend right](10,-3);
\draw[black, thick, ->-=.5] (9.5,-3.5) to [bend left](10.5,-3.5);
\draw[black, thick, ->-=.5] (10.5,-3.5) to [bend left](9.5,-3.5);
% Vertices (neurons)
\draw (1,-6) node {$\bullet$};
\draw [Blue](1,-5.5) node {$\bullet$};
\draw [ForestGreen](0.5,-5) node {$\bullet$};
\draw [ForestGreen](1.5,-5) node {$\bullet$};
\draw[black, thick, ->-=.5] (1,-5.5) to (1,-6);
\draw[black, thick, ->-=.5] (0.5,-5) to (1,-5.5);
\draw[black, thick, ->-=.5] (1.5,-5) to (1,-5.5);
% Vertices (neurons)
\draw (4,-6) node {$\bullet$};
\draw [Blue](4,-5.5) node {$\bullet$};
\draw [ForestGreen](3.5,-5) node {$\bullet$};
\draw [ForestGreen](4.5,-5) node {$\bullet$};
\draw[black, thick, ->-=.5] (4,-5.5) to (4,-6);
\draw[black, thick, ->-=.5] (3.5,-5) to (4,-5.5);
\draw[black, thick, ->-=.5] (4.5,-5) to (4,-5.5);
\draw[black, thick, ->-=.5] (3.5,-5) to [bend left](4.5,-5);
\draw[black, thick, ->-=.5] (4.5,-5) to [bend left](3.5,-5);
% Vertices (neurons)
\draw (7,-6) node {$\bullet$};
\draw [Blue](7,-5.5) node {$\bullet$};
\draw [ForestGreen](6.5,-5) node {$\bullet$};
\draw [ForestGreen](7.5,-5) node {$\bullet$};
\draw[black, thick, ->-=.5] (7,-5.5) to (7,-6);
\draw[black, thick, ->-=.5] (6.5,-5) to[bend left] (7,-5.5);
\draw[black, thick, ->-=.5] (7.5,-5) to [bend left](7,-5.5);
\draw[black, thick, ->-=.5] (7,-5.5) to [bend left](7.5,-5);
\draw[black, thick, ->-=.5] (7,-5.5) to [bend left](6.5,-5);
% Vertices (neurons)
\draw (10,-6) node {$\bullet$};
\draw [Blue](10,-5.5) node {$\bullet$};
\draw [ForestGreen](9.5,-5) node {$\bullet$};
\draw [ForestGreen](10.5,-5) node {$\bullet$};
\draw[black, thick, ->-=.5] (10,-5.5) to (10,-6);
\draw[black, thick, ->-=.5] (9.5,-5) to[bend left] (10,-5.5);
\draw[black, thick, ->-=.5] (10.5,-5) to [bend left](10,-5.5);
\draw[black, thick, ->-=.5] (10,-5.5) to [bend left](10.5,-5);
\draw[black, thick, ->-=.5] (10,-5.5) to [bend left](9.5,-5);
\draw[black, thick, ->-=.5] (9.5,-5) to [bend left](10.5,-5);
\draw[black, thick, ->-=.5] (10.5,-5) to [bend left](9.5,-5);
\end{tikzpicture}\\
\end{center}
\section{Main results}
We now demonstrate the main results of the article, namely: {\bf 1)} an exact expression for the total number of networks; {\bf 2)} a recursive formula yielding the exact value of $N(m,n_1,n_2,k;x)$; {\bf 3)} sharp analytic bounds for $N^*(m,n_1,n_2,k)$; {\bf 4)} sharp analytic bounds for $N(m,n_1,n_2,k)$.
\subsection{Exact series}
\subsubsection{Total number of networks}
\begin{theorem}\label{theoX}
$$
N^*(m,n_1,n_2,k;x)=\sum_{\lambda_1\vdash n_1\atop{\lambda_2\vdash n_2}}\frac{\omega(\lambda_1)\omega(\lambda_2)}{n_1!n_2!}G(\lambda_1\cup \lambda_2; x)
$$
with
$$
G(\lambda;x)=F(\lambda;x)D(\lambda|(1^m);x)D(\lambda|(1^k);x).
$$
\end{theorem}
\begin{proof}
Since the intermediate nodes of type I are distinguishable from those of type II, we only consider partitions $\lambda_1\cup \lambda_2$ with $\lambda_1\vdash n_1$ and $\lambda_2\vdash n_2$. In the expression $G(\lambda;x)$, the factor $F(\lambda;x)$ stands for the action of a permutation of type $\lambda$ on connections between intermediate nodes. The factor $D(\lambda|(1^m);x)$ stands for the action of a permutation of type $\lambda$ on the connections from the $m$ input nodes to the intermediate nodes. Finally, the factor $D(\lambda|(1^k);x)$ stands for the action of a permutation of type $\lambda$ on the connections from the intermediate nodes to the $k$ output nodes.
\end{proof}
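Theorem \ref{theoX} can be turned directly into a short program. In the Python sketch below (our own helper names, not the authors' implementation; $\omega(\lambda)$ is taken to be the number of permutations of cycle type $\lambda$, that is $n!/\prod_j j^{a_j}a_j!$), the checks $N^*(0,2,0,0)=3$ and $N^*(0,3,0,0)=16$ recover the numbers of digraphs on $2$ and $3$ unlabelled nodes:

```python
from fractions import Fraction
from math import factorial, gcd

def lcm(a, b):
    return a * b // gcd(a, b)

def F(lam, x):
    lens = sorted(lam)
    val = 1
    for j in lens:
        a = lam[j]
        val *= (1 + x ** j) ** (a * (j - 1) + j * a * (a - 1))
    for i, j in enumerate(lens):
        for y in lens[i + 1:]:
            val *= (1 + x ** lcm(j, y)) ** (2 * j * y * lam[j] * lam[y] // lcm(j, y))
    return val

def D(l1, l2, x):
    val = 1
    for j, a1 in l1.items():
        for y, a2 in l2.items():
            val *= (1 + x ** lcm(j, y)) ** (j * y * a1 * a2 // lcm(j, y))
    return val

def partitions(n, largest=None):
    # partitions of n encoded as {part: multiplicity}
    if largest is None:
        largest = n
    if n == 0:
        yield {}
        return
    for p in range(min(n, largest), 0, -1):
        for rest in partitions(n - p, p):
            lam = dict(rest)
            lam[p] = lam.get(p, 0) + 1
            yield lam

def omega(lam):
    # number of permutations of cycle type lam
    n = sum(j * a for j, a in lam.items())
    w = factorial(n)
    for j, a in lam.items():
        w //= j ** a * factorial(a)
    return w

def Nstar(m, n1, n2, k, x=1):
    total = Fraction(0)
    for l1 in partitions(n1):
        for l2 in partitions(n2):
            lam = {}
            for d in (l1, l2):
                for j, a in d.items():
                    lam[j] = lam.get(j, 0) + a
            G = F(lam, x) * D(lam, {1: m}, x) * D(lam, {1: k}, x)
            total += Fraction(omega(l1) * omega(l2) * G,
                              factorial(n1) * factorial(n2))
    return total

print(Nstar(0, 2, 0, 0))  # 3
```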
\subsubsection{Number of admissible networks}
The following result gives an effective recursive formula for $N(m,n_1,n_2,k;x)$ which we implemented in Python to obtain the values of $N( m,n_1,n_2,k)=N(m,n_1,n_2,k;1)$ given in Table 1.
\begin{theorem}\label{theoXX}
\begin{equation}
N(m,n_1,n_2,k;x)=\sum_{\lambda_1\vdash n_1\atop{\lambda_2\vdash n_2}}\frac{\omega(\lambda_1)\omega(\lambda_2)}{n_1!n_2!}C(\lambda_1\cup \lambda_2;x)
\end{equation}
where $C(\lambda;x)$ can be obtained recursively through the following equation
\begin{eqnarray}\label{eq:t11}
C(\lambda;x)&=&G(\lambda;x)-\sum_{\mu_1 \cup \mu_2 \cup \mu_3\cup \mu_4 =\lambda\atop{\mu_2\cup \mu_3\cup \mu_4\not=\phi}}\Omega\left(\mu_1,\mu_2,\mu_3,\mu_4\right)C(\mu_1;x)\\
& &T(\mu_2|\mu_1\cup (1^m);x)T(\mu_3|\mu_1\cup (1^k);x)F(\mu_4;x)D(\mu_2\cup \mu_4|\mu_3;x)D(\mu_2|\mu_4;x)\nonumber
\end{eqnarray}
with
$C(\phi;x)=1.$
\end{theorem}
\begin{proof}
Theorem \ref{theoXX} follows rather directly from the definitions of the auxiliary functions. The intermediate nodes of a network can be divided into four disjoint groups (regardless of whether they are of type I or of type II) as follows:
\begin{enumerate}
\item Intermediate nodes of the first group, tagged as `connected' intermediate nodes, both receive a path from an input node and send a path to an output node. Other intermediate nodes are tagged as `unconnected'.
\item Unconnected intermediate nodes in the second group receive a path from an input node but send none to an output node.
\item Unconnected intermediate nodes of the third group send a path to an output node but receive no path from an input node.
\item Unconnected intermediate nodes of the fourth group neither receive a path from an input node nor send a path to an output node.
\end{enumerate}
We write $\mu_j$, $1\le j\le 4$, for the types of the permutations acting on the intermediate nodes belonging to group $j$. If a network is not counted by $N(m,n_1,n_2,k)$, then $\mu_2\cup \mu_3\cup \mu_4 \not= \phi$ since at least one intermediate node does not belong to the first group. Equation ($\ref{eq:t11}$) subtracts from all possible networks those in which some intermediate nodes are unconnected.
The factor $C(\mu_1;x)$ corresponds to the action of a permutation of type $\mu_1$ on the connections between two `connected' intermediate nodes, as well as on connections between connected intermediate nodes and input/output nodes. The factor $T(\mu_2|\mu_1\cup (1^m);x)$ accounts for the action of permutations of types $\mu_1$ and $\mu_2$ on the connections between two intermediate nodes of the second group, as well as on connections from a connected intermediate node or from an input node to intermediate nodes of the second group. The factor $T(\mu_3|\mu_1\cup (1^k);x)$ accounts for the action of permutations of types $\mu_1$ and $\mu_3$ on connections between two intermediate nodes of group 3, as well as on connections from intermediate nodes of group 3 to connected intermediate nodes or to output nodes. The factor $F(\mu_4;x)$ accounts for the action of a permutation of type $\mu_4$ on connections between two intermediate nodes of group 4. The factor $D(\mu_2\cup \mu_4|\mu_3;x)$ accounts for the action of permutations of types $\mu_2,\mu_3$ and $\mu_4$ on connections from intermediate nodes of group 3 to intermediate nodes of either group 2 or group 4. Finally, the factor $D(\mu_2|\mu_4;x)$ accounts for the action of permutations of types $\mu_2$ and $\mu_4$ on connections from an intermediate node of group 4 to an intermediate node of group 2. Since in formula ($\ref{eq:t11}$) the partition $\mu_1$ partitions an integer strictly smaller than the one partitioned by $\lambda$, the recursion terminates. The argument of the proof is illustrated in the schematic below.
\end{proof}
\begin{center}
\begin{tikzpicture}
\draw (0,1) circle(0.6);
\node[align=left] at (0,1) {Input};
\draw (5,1) circle (0.6);
\node[align=left] at (5,1) {Output};
\draw (2.5,0) circle (0.6);
\node[align=left] at (2.5,0) {Gr. 1};
\draw (1.5,2) circle (0.6);
\node[align=left] at (1.5,2) {Gr. 2};
\draw (3.5,2) circle (0.6);
\node[align=left] at (3.5,2) {Gr. 3};
\draw (2.5,3.5) circle (0.6);
\node[align=left] at (2.5,3.5) {Gr. 4};
\draw[black, thick, ->-=.5] (0.4,0.6) to (1.9,0);
\draw[black, thick, ->-=.5] (3.1,0) to (4.6,0.6);
\draw[black, thick, ->-=.5] (0.6,1.2) to (1.1,1.6);
\draw[black, thick, ->-=.5] (2.9,2) to (2.1,2);
\draw[black, thick, ->-=.5] (4,1.7) to (4.5,1.3);
\draw[black, thick, ->-=.5] (2.2,0.5) to (1.5,1.4);
\draw[black, thick, ->-=.5] (3.4,1.4) to (2.8,0.5);
\draw[black, thick, ->-=.5] (3.2,2.5) to (2.8,3);
\draw[black, thick, ->-=.5] (2.2,3) to (1.8,2.5);
\end{tikzpicture}
\end{center}
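The recursion of Theorem \ref{theoXX} is short to implement. The Python sketch below computes $C(\lambda;x)$ (our own encoding and helper names; $\Omega$ is taken to be the number of ways of distributing the cycles of $\lambda$ among the $\mu_i$, a product of multinomial coefficients); the outer sum over $\lambda_1\vdash n_1$, $\lambda_2\vdash n_2$ then yields $N(m,n_1,n_2,k;x)$ exactly as in the theorem. For a single intermediate node with $m=k=2$, the recursion gives $C((1^1);1)=(2^2-1)^2=9$, in agreement with direct enumeration.

```python
from itertools import product
from math import comb, factorial, gcd

def lcm(a, b):
    return a * b // gcd(a, b)

def F(lam, x):
    lens = sorted(lam)
    val = 1
    for j in lens:
        a = lam[j]
        val *= (1 + x ** j) ** (a * (j - 1) + j * a * (a - 1))
    for i, j in enumerate(lens):
        for y in lens[i + 1:]:
            val *= (1 + x ** lcm(j, y)) ** (2 * j * y * lam[j] * lam[y] // lcm(j, y))
    return val

def D(l1, l2, x):
    val = 1
    for j, a1 in l1.items():
        for y, a2 in l2.items():
            val *= (1 + x ** lcm(j, y)) ** (j * y * a1 * a2 // lcm(j, y))
    return val

def I(l1, l2, x):
    val = 1
    for j, a1 in l1.items():
        inner = 1
        for y, a2 in l2.items():
            inner *= (1 + x ** lcm(j, y)) ** (j * y * a2 // lcm(j, y))
        val *= (inner - 1) ** a1
    return val

def union(*lams):
    out = {}
    for lam in lams:
        for j, a in lam.items():
            if a:
                out[j] = out.get(j, 0) + a
    return out

def splits2(lam):
    items = sorted(lam.items())
    for pick in product(*[range(a + 1) for _, a in items]):
        mu1 = {j: c for (j, a), c in zip(items, pick) if c > 0}
        mu2 = {j: a - c for (j, a), c in zip(items, pick) if a - c > 0}
        if mu1 and mu2:
            om = 1
            for (j, a), c in zip(items, pick):
                om *= comb(a, c)
            yield mu1, mu2, om

def T(l1, l2, x):
    val = I(l1, l2, x) * F(l1, x)
    for mu1, mu2, om in splits2(l1):
        val += om * I(mu1, l2, x) * F(mu1, x) * D(mu1, mu2, x) * T(mu2, mu1, x)
    return val

def splits4(lam):
    # distribute the cycles of lam among mu_1..mu_4, with multiplicity Omega
    items = sorted(lam.items())
    def comps(a):
        return [(c1, c2, c3, a - c1 - c2 - c3) for c1 in range(a + 1)
                for c2 in range(a + 1 - c1) for c3 in range(a + 1 - c1 - c2)]
    for combo in product(*[comps(a) for _, a in items]):
        mus, om = [{}, {}, {}, {}], 1
        for (j, a), cs in zip(items, combo):
            om *= factorial(a)
            for i, c in enumerate(cs):
                om //= factorial(c)
                if c:
                    mus[i][j] = c
        yield mus, om

def C(lam, m, k, x):
    if not lam:
        return 1
    val = F(lam, x) * D(lam, {1: m}, x) * D(lam, {1: k}, x)  # G(lambda; x)
    for (m1, m2, m3, m4), om in splits4(lam):
        if not (m2 or m3 or m4):
            continue
        val -= (om * C(m1, m, k, x) * T(m2, union(m1, {1: m}), x)
                * T(m3, union(m1, {1: k}), x) * F(m4, x)
                * D(union(m2, m4), m3, x) * D(m2, m4, x))
    return val

print(C({1: 1}, 2, 2, 1))  # 9 = (2^2 - 1)^2
```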
\subsection{Asymptotic series}
\subsubsection{Total number of networks}
Asymptotic formulas have been obtained for several graph enumeration problems, including \cite{Hofstad2006}. In this direction, Theorem \ref{theoX} allows us to derive the following approximation for $N^*(m,n_1,n_2,k)$.
\begin{theorem}\label{theoAX}
\begin{eqnarray*}
N^*(m,n_1,n_2,k)&=&\frac{2^{(n+m+k-1)n}}{n_1!n_2!}\Bigg(1+\left({n_1\choose 2}+{n_2\choose 2}\right)2^{-2n-m-k+3}\\
& &+\left(2{n_1\choose 3}+2{n_2\choose 3}\right)2^{-4n-2m-2k+8}\\
& &+\left({n_1 \choose 2}{n_2\choose 2}+3{n_1\choose 4}+3{n_2\choose 4}\right)2^{-4n-2m-2k+10}\Bigg)\\
& &+O\left(\frac{2^{(n+m+k-1)n}}{n_1!n_2!}n^72^{-6n-3m-3k}\right).
\end{eqnarray*}
\end{theorem}
\begin{proof} The proof is technical but rather straightforward. The idea behind it is that the main contributions to $N^*(m,n_1,n_2,k)$ come from the permutations leaving all or almost all intermediate nodes fixed. We will quantify these contributions exactly and show that the others can be neglected. The right-hand side in the statement of Theorem \ref{theoAX} could be further expanded, but we limited ourselves to four terms for the sake of simplicity. If we define the partition $\lambda_1$ by $\lambda_1:=(1^n)$, we have
\begin{equation}\label{eq:term1}
\omega(\lambda_1)=1 \mbox{ and } G(\lambda_1;x)=(1+x)^{(n+m+k-1)n}
\end{equation}
so the contribution of the identity permutation to $N^*(m,n_1,n_2,k)$ is equal to $\frac{2^{(m+n+k-1)n}}{n_1!n_2!}$. We will now show that the contributions of other permutation types are small compared to this.
Let $\lambda_2=(1^{n-2},2^1)$. We have
\begin{eqnarray}\label{eq:term2}
& &\sum_{\mu_1\vdash n_1,\mu_2\vdash n_2\atop{\mu_1\cup \mu_2=\lambda_2}}\omega(\mu_1)\omega(\mu_2)={n_1\choose 2}+{n_2\choose 2}\\
\mbox{ and }& & G(\lambda_2;x)=(1+x)^{(n+m+k-3)(n-2)}(1+x^2)^{1+2(n-2)+m+k}\nonumber.
\end{eqnarray}
Let $\lambda_3=(1^{n-3},3^1)$. We have
\begin{eqnarray}\label{eq:term3}
& &\sum_{\mu_1\vdash n_1, \mu_2\vdash n_2\atop{\mu_1\cup \mu_2=\lambda_3}}\omega(\mu_1)\omega(\mu_2)=2{n_1\choose 3}+2{n_2\choose 3} \\
\mbox{ and } & & G(\lambda_3;x)=(1+x)^{(n+m+k-4)(n-3)}(1+x^3)^{2+2(n-3)+m+k}\nonumber.
\end{eqnarray}
Let $\lambda_4=(1^{n-4},2^2)$. We have
\begin{eqnarray}\label{eq:term4}
& &\sum_{\mu_1\vdash n_1, \mu_2 \vdash n_2\atop{\mu_1\cup \mu_2=\lambda_4}}\omega(\mu_1)\omega(\mu_2)={n_1\choose 2}{n_2\choose 2}+3{n_1\choose 4}+3{n_2\choose 4}\\
\mbox{ and } & &G(\lambda_4;x)=(1+x)^{(n+m+k-5)(n-4)}(1+x^2)^{6+4(n-4)+2m+2k}.\nonumber
\end{eqnarray}
Let $\lambda_5=(1^{n-4},4^1)$. We have
\begin{eqnarray}\label{eq:term5}
& &\sum_{\mu_1\vdash n_1,\mu_2\vdash n_2\atop{\mu_1\cup \mu_2=\lambda_5}}\omega(\mu_1)\omega(\mu_2)=6{n_1\choose 4}+6{n_2\choose 4}\\
\mbox{ and }& & G(\lambda_5;x)=(1+x)^{(n+m+k-5)(n-4)}(1+x^4)^{3+2(n-4)+m+k}.\nonumber
\end{eqnarray}
Let $\lambda_6=(1^{n-5},2^1,3^1)$. We have
\begin{eqnarray}\label{eq:term6}
\sum_{\mu_1\vdash n_1, \mu_2\vdash n_2\atop{\mu_1\cup \mu_2=\lambda_6}}\omega(\mu_1)\omega(\mu_2)&=&2{n_1\choose 2,3}+2{n_2\choose 2,3}+2{n_1\choose 2}{n_2\choose 3}+2{n_1\choose 3}{n_2\choose 2}\\
G(\lambda_6;x)&=&(1+x)^{(n+m+k-6)(n-5)}(1+x^2)^{1+2(n-5)+m+k}(1+x^3)^{2+2(n-5)+m+k}\nonumber\\
& &\times (1+x^6)^{2}\nonumber.
\end{eqnarray}
Finally, let $\lambda_7=(1^{n-5},5^1)$. We have
\begin{eqnarray}\label{eq:term7}
& &\sum_{\mu_1\vdash n_1, \mu_2\vdash n_2\atop{\mu_1\cup \mu_2=\lambda_7}}\omega(\mu_1)\omega(\mu_2)=24{n_1\choose 5}+24{n_2\choose 5}\\
\mbox{ and }& & G(\lambda_7;x)=(1+x)^{(n+m+k-6)(n-5)}(1+x^5)^{4+2(n-5)+m+k}.\nonumber
\end{eqnarray}
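The identities (\ref{eq:term2})--(\ref{eq:term7}) evaluated at $x=1$ are easy to spot-check numerically. For instance, with $n=6$, $m=3$, $k=2$, the sketch below (reusing the product formulas for $F$ and $D$ in our own dictionary encoding) confirms the exponents for the types $\lambda_2$, $\lambda_3$ and $\lambda_4$:

```python
from math import gcd

def lcm(a, b):
    return a * b // gcd(a, b)

def F(lam, x):
    lens = sorted(lam)
    val = 1
    for j in lens:
        a = lam[j]
        val *= (1 + x ** j) ** (a * (j - 1) + j * a * (a - 1))
    for i, j in enumerate(lens):
        for y in lens[i + 1:]:
            val *= (1 + x ** lcm(j, y)) ** (2 * j * y * lam[j] * lam[y] // lcm(j, y))
    return val

def D(l1, l2, x):
    val = 1
    for j, a1 in l1.items():
        for y, a2 in l2.items():
            val *= (1 + x ** lcm(j, y)) ** (j * y * a1 * a2 // lcm(j, y))
    return val

def G(lam, m, k, x):
    return F(lam, x) * D(lam, {1: m}, x) * D(lam, {1: k}, x)

n, m, k = 6, 3, 2
base = (n + m + k - 1) * n
print(G({1: n - 2, 2: 1}, m, k, 1) == 2 ** (base - 2 * n - m - k + 3))        # True
print(G({1: n - 3, 3: 1}, m, k, 1) == 2 ** (base - 4 * n - 2 * m - 2 * k + 8))  # True
print(G({1: n - 4, 2: 2}, m, k, 1) == 2 ** (base - 4 * n - 2 * m - 2 * k + 10)) # True
```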
For $h$ a nonnegative integer, we define the set $\gamma(h)$ as
$$
\gamma(h):=\{\lambda: \exists \mu_1\vdash n_1, \mu_2\vdash n_2, \mu_1\cup \mu_2=\lambda, a_1^{\lambda}=n-h\}
$$
that is, the set of cycle types of permutations of the $n$ intermediate nodes having exactly $n-h$ fixed points. From equation $(\ref{eq:term1})$ we have
\begin{equation}\label{eq:new1}
\sum_{\lambda_1 \vdash n_1,\lambda_2 \vdash n_2\atop{\lambda_1\cup \lambda_2 \in \gamma(0)} }\frac{\omega(\lambda_1)\omega(\lambda_2)}{n_1!n_2!}G(\lambda_1\cup \lambda_2;1)=\frac{1}{n_1!n_2!}2^{(n+m+k-1)n}.
\end{equation}
From equation ($\ref{eq:term2}$) we have
\begin{equation}
\sum_{\lambda_1 \vdash n_1,\lambda_2 \vdash n_2\atop{\lambda_1\cup \lambda_2 \in \gamma(2)} }\frac{\omega(\lambda_1)\omega(\lambda_2)}{n_1!n_2!}G(\lambda_1\cup \lambda_2;1)=\frac{{n_1\choose 2}+{n_2\choose 2}}{n_1!n_2!}2^{(n+m+k-1)n-2n-m-k+3}.
\end{equation}
From equation ($\ref{eq:term3}$) we have
\begin{equation}
\sum_{\lambda_1 \vdash n_1,\lambda_2 \vdash n_2\atop{\lambda_1\cup \lambda_2 \in \gamma(3)} }\frac{\omega(\lambda_1)\omega(\lambda_2)}{n_1!n_2!}G(\lambda_1\cup \lambda_2;1)=\frac{2{n_1\choose 3}+2{n_2\choose 3}}{n_1!n_2!}2^{(n+m+k-1)n-4n-2m-2k+8}.
\end{equation}
From equations ($\ref{eq:term4}$) and ($\ref{eq:term5}$) we have
\begin{eqnarray}
& &\sum_{\lambda_1 \vdash n_1,\lambda_2 \vdash n_2\atop{\lambda_1\cup \lambda_2 \in \gamma(4)} }\frac{\omega(\lambda_1)\omega(\lambda_2)}{n_1!n_2!}G(\lambda_1\cup \lambda_2;1)\nonumber\\
&=&\frac{{n_1\choose 2}{n_2\choose 2}+3{n_1\choose 4}+3{n_2\choose 4}}{n_1!n_2!}2^{(n+m+k-1)n-4n-2m-2k+10}\nonumber\\
& &+\frac{6{n_1\choose 4}+6{n_2\choose 4}}{n_1!n_2!}2^{(n+m+k-1)n-6n-3m-3k+15}.
\end{eqnarray}
Finally, from equations ($\ref{eq:term6}$) and ($\ref{eq:term7}$) we have
\begin{eqnarray}\label{eq:new2}
& &\sum_{\lambda_1 \vdash n_1,\lambda_2 \vdash n_2\atop{\lambda_1\cup \lambda_2 \in \gamma(5)} }\frac{\omega(\lambda_1)\omega(\lambda_2)}{n_1!n_2!}G(\lambda_1\cup \lambda_2;1)\nonumber\\
&=&\frac{2{n_1\choose 2,3}+2{n_2\choose 2,3}+2{n_1\choose 2}{n_2\choose 3}+2{n_1\choose 3}{n_2\choose 2}}{n_1!n_2!}2^{(n+m+k-1)n-6n-3m-3k+13}\nonumber\\
& &+\frac{24{n_1\choose 5}+24{n_2\choose 5}}{n_1!n_2!}2^{(n+m+k-1)n-8n-4m-4k+24}.
\end{eqnarray}
From equations (\ref{eq:new1}-\ref{eq:new2}) we obtain
\begin{eqnarray}\label{eq:final1}
N^*(m,n_1,n_2,k)&=&\frac{2^{(n+m+k-1)n}}{n_1!n_2!}\Bigg(1+\left({n_1\choose 2}+{n_2\choose 2}\right)2^{-2n-m-k+3}\nonumber\\
& &+\left(2{n_1\choose 3}+2{n_2\choose 3}\right)2^{-4n-2m-2k+8}\nonumber\\
& &+\left({n_1 \choose 2}{n_2\choose 2}+3{n_1\choose 4}+3{n_2\choose 4}\right)2^{-4n-2m-2k+10}\Bigg)\nonumber\\
& &+O\left(\frac{2^{(n+m+k-1)n}}{n_1!n_2!}n^52^{-6n-3m-3k}\right)\nonumber\\
& &+\sum_{h=6}^n\sum_{\lambda_1\vdash n_1,\lambda_2\vdash n_2\atop{\lambda_1\cup\lambda_2\in\gamma(h)}}\frac{\omega(\lambda_1)\omega(\lambda_2)}{n_1!n_2!}G(\lambda_1\cup\lambda_2;1).
\end{eqnarray}
Suppose that a permutation $s$ is of type $\lambda \in\gamma(h)$; we count the connections that are not fixed by $s$. The $h(m+k)$ connections between moved intermediate nodes and input/output nodes lie in orbits of length greater than one, as do the $h(h-1)$ connections between two intermediate nodes which are not fixed by $s$, and the $2h(n-h)$ connections between an intermediate node fixed by $s$ and one that is not. The total number of connections in orbits of length greater than one is thus equal to
$$
h(m+k)+h(h-1)+2h(n-h).
$$
Given that the total number of connections is equal to $(n+m+k-1)n$, the number of connection orbits induced by a permutation of type $\lambda \in \gamma(h)$ is thus at most
$$
(n+m+k-1)n-\frac{h(m+k)+h(h-1)+2h(n-h)}{2}.
$$
Therefore
\begin{equation}\label{eq:reste1}
\lambda\in \gamma(h)\Rightarrow G(\lambda;1)\le 2^{(m+n+k-1)n-h(m+k+2n-h-1)/2}.
\end{equation}
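Inequality (\ref{eq:reste1}) can likewise be spot-checked. For $n=6$, $m=k=2$ and the types $(1^3,3^1)\in\gamma(3)$ and $(1^2,2^2)\in\gamma(4)$, the sketch below (same encoding of types as before) compares $G(\lambda;1)$ with the bound; the second type attains the bound with equality:

```python
from math import gcd

def lcm(a, b):
    return a * b // gcd(a, b)

def F(lam, x):
    lens = sorted(lam)
    val = 1
    for j in lens:
        a = lam[j]
        val *= (1 + x ** j) ** (a * (j - 1) + j * a * (a - 1))
    for i, j in enumerate(lens):
        for y in lens[i + 1:]:
            val *= (1 + x ** lcm(j, y)) ** (2 * j * y * lam[j] * lam[y] // lcm(j, y))
    return val

def D(l1, l2, x):
    val = 1
    for j, a1 in l1.items():
        for y, a2 in l2.items():
            val *= (1 + x ** lcm(j, y)) ** (j * y * a1 * a2 // lcm(j, y))
    return val

def G(lam, m, k, x):
    return F(lam, x) * D(lam, {1: m}, x) * D(lam, {1: k}, x)

n, m, k = 6, 2, 2
def bound(h):
    return 2 ** ((m + n + k - 1) * n - h * (m + k + 2 * n - h - 1) // 2)

g1 = G({1: 3, 3: 1}, m, k, 1)  # a type in gamma(3)
g2 = G({1: 2, 2: 2}, m, k, 1)  # a type in gamma(4)
print(g1 <= bound(3), g2 <= bound(4))  # True True
```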
We also have
\begin{equation}\label{eq:reste2}
\sum_{\lambda_1 \vdash n_1,\lambda_2 \vdash n_2\atop{\lambda_1\cup \lambda_2 \in \gamma(h)} }\omega(\lambda_1)\omega(\lambda_2)\le \sum_{h_1=0}^{\min(n_1,h)}{n_1\choose h_1}{n_2\choose h-h_1}h_1!(h-h_1)!\le \sum_{h_1=0}^{\min(n_1,h)}n^h\le n^{h+1}.
\end{equation}
Equations ($\ref{eq:reste1}$) and ($\ref{eq:reste2}$) imply
\begin{eqnarray}\label{eq:final2}
\sum_{h=6}^n\sum_{\lambda_1 \vdash n_1,\lambda_2 \vdash n_2\atop{\lambda_1\cup \lambda_2 \in \gamma(h)} }\frac{\omega(\lambda_1)\omega(\lambda_2)}{n_1!n_2!}G(\lambda_1\cup \lambda_2;1)&\le& \frac{2^{(m+n+k-1)n}}{n_1!n_2!}\sum_{h=6}^n n^{h+1}2^{-h(m+k+2n-h-1)/2}\nonumber\\
&\ll& \frac{2^{(m+n+k-1)n}}{n_1!n_2!}n^{7}2^{-3(m+k+2n-7)}\\
&=&O\left(\frac{2^{(n+m+k-1)n}}{n_1!n_2!}n^72^{-6n -3m-3k}\right).\nonumber
\end{eqnarray}
The proof of Theorem \ref{theoAX} now follows from (\ref{eq:final1}) and (\ref{eq:final2}).
\end{proof}
\subsubsection{Number of admissible networks}
A probabilistic argument shows that if each potential connection is equally likely to exist or not, then almost all graphs are connected as the number of nodes increases \cite{Erdos1959}. The next result shows that a similar phenomenon occurs in the context of the problem investigated in the present paper, as $N(m,n_1,n_2,k)\sim N^*(m,n_1,n_2,k)$ when the number of nodes increases.
\begin{theorem}\label{theoAXX}
Assuming that $\min(m,k)>1$, we have
\begin{eqnarray*}
N(m,n_1,n_2,k)&=& \frac{1}{n_1!n_2!}2^{(n+m+k-1)n}-\frac{n}{n_1!n_2!}2^{(m+n+k-1)n-n-m+1}\\
& &-\frac{n}{n_1!n_2!}2^{(m+n+k-1)n-n-k+1}+O\left(\frac{2^{(n+m+k-1)n}n^3}{n_1!n_2!}\left(2^{-2n-\min(m,k)}\right)\right).
\end{eqnarray*}
\end{theorem}
\begin{proof}
We will now provide a proof of Theorem \ref{theoAXX} following the same general idea as in the proof of Theorem \ref{theoAX}, namely that the contributions to $N(m,n_1,n_2,k;x)$ from permutations other than the identity permutation are relatively small. Furthermore, when subtracting the networks not satisfying the connectivity conditions, the main contribution to the subtracted quantity comes from the case where only one intermediate node belongs to the unconnected group.
From equations ($\ref{eq:reste1}$) and ($\ref{eq:reste2}$) we have:
\begin{eqnarray}\label{eq:Th531}
& &\sum_{h\ge 2}\sum_{\lambda_1\vdash n_1, \lambda_2\vdash n_2\atop{\lambda_1\cup\lambda_2\in \gamma(h)}}\frac{\omega(\lambda_1)\omega(\lambda_2)}{n_1!n_2!}C(\lambda_1\cup\lambda_2;1)\le \sum_{h\ge 2}\sum_{\lambda_1\vdash n_1, \lambda_2\vdash n_2\atop{\lambda_1\cup\lambda_2\in \gamma(h)}}\frac{\omega(\lambda_1)\omega(\lambda_2)}{n_1!n_2!}G(\lambda_1\cup\lambda_2;1)\nonumber\\
&\le& \frac{2^{(m+n+k-1)n}}{n_1!n_2!}\sum_{h=2}^{n}n^{h+1}2^{-h(m+k+2n-h-1)/2}\nonumber\\
&=&O\left(\frac{2^{(n+m+k-1)n}}{n_1!n_2!}n^32^{-2n-m-k}\right).
\end{eqnarray}
This means that for the purpose of Theorem \ref{theoAXX}, all permutations except the identity can be neglected. For the remainder of the proof, we will thus assume $\lambda=(1^n)$. We have from Theorem \ref{theoXX}
\begin{eqnarray}\label{eq:t531}
C((1^n);x)&=&G((1^n);x)\\
&-&\sum_{h=1}^{n}\sum_{\mu_1\cup\mu_2\cup\mu_3\cup \mu_4=(1^n)\atop{\mu_2\cup\mu_3\cup\mu_4=(1^h)}}C(\mu_1;x)\Omega(\mu_1,\mu_2,\mu_3,\mu_4)U((\mu_2,\mu_3,\mu_4),\mu_1;x)\nonumber
\end{eqnarray}
where the sum on $h$ runs over all possible numbers of unconnected intermediate nodes and
\begin{eqnarray*}
&U((\mu_2,\mu_3,\mu_4),\mu_1;x)=\\
&T(\mu_2|\mu_1\cup (1^m);x)T(\mu_3|\mu_1\cup (1^k);x)F(\mu_4;x)D(\mu_2\cup \mu_4|\mu_3;x)D(\mu_2|\mu_4;x).
\end{eqnarray*}
We now show that the contribution of instances with more than one unconnected intermediate node is relatively small.
Assuming that $\mu_2\cup\mu_3\cup\mu_4 =(1^h)$ we have that $\mu_1=(1^{n-h})$. The number of forbidden connections is at least
$$
h((n-h)+\min(m,k)).
$$
Therefore
\begin{equation}
C(\mu_1;1)U((\mu_2,\mu_3,\mu_4),\mu_1;1)\le 2^{(n+m+k-1)n}2^{-h((n-h)+\min(m,k))}.
\end{equation}
The number of ways of choosing the $h$ unconnected intermediate nodes is equal to ${n\choose h}$. The number of ways of choosing which of these unconnected intermediate nodes will not send a path to an output neuron is equal to $2^h$. We thus have
\begin{eqnarray}\label{eq:t532}
& &\sum_{\mu_1\cup\mu_2\cup \mu_3\cup\mu_4=(1^n)\atop{\mu_2\cup\mu_3 \cup\mu_4=(1^h)}}C((1^{n-h});1)\Omega(\mu_1,\mu_2,\mu_3,\mu_4)U((\mu_2,\mu_3,\mu_4),(1^{n-h});1)\\
&\le & 2^{(n+m+k-1)n}{n\choose h}2^h2^{-h((n-h)+\min(m,k))}.\nonumber
\end{eqnarray}
Given that, under the assumption that $\min(m,k)>1$, we have
\begin{equation*}
\sum_{h=2}^{n}{n\choose h}2^{-h((n-h)+\min(m,k)-1)}=O\left(n^22^{-2n-2\min(m,k)}\right)
\end{equation*}
we obtain the following from equations ($\ref{eq:t531}$) and ($\ref{eq:t532}$):
\begin{eqnarray}\label{eq:Fin1}
C((1^n);1)&=&2^{(n+m+k-1)n}\\
& &-\sum_{\mu_1\cup\mu_2\cup \mu_3\cup\mu_4=(1^n)\atop{\mu_2\cup\mu_3 \cup\mu_4=(1)}}C(\mu_1;1)\Omega((\mu_1,\mu_2,\mu_3,\mu_4),(1^n))U((\mu_2,\mu_3,\mu_4),\mu_1;1)\nonumber\\
& &+O\left(n^2 2^{(n+m+k-1)n-2n-2\min(m,k)}\right)\nonumber.
\end{eqnarray}
Assuming that only one intermediate node is unconnected, this neuron can be in group 2, group 3, or group 4. In each instance $\Omega((\mu_1,\mu_2,\mu_3,\mu_4),(1^n))=n$. If the unconnected intermediate node is in group 2, we have $U(((1),\phi,\phi),(1^{n-1});1)=2^{n-1+k}$; if it is in group 3, we have $U((\phi,(1),\phi),(1^{n-1});1)=2^{n-1+m}$; while if it is in group 4, $U((\phi,\phi,(1)),(1^{n-1});1)=1$. Using
$$
C((1^{n-1});1)=2^{(n+m+k-2)(n-1)}\left(1+O(n2^{-n})\right)
$$
this yields
\begin{eqnarray}\label{eq:Fin2}
& &\sum_{\mu_1\cup\mu_2\cup \mu_3\cup\mu_4=(1^n)\atop{\mu_2\cup\mu_3 \cup\mu_4=(1)}}C(\mu_1;1)\Omega(\mu_1,\mu_2,\mu_3,\mu_4)U((\mu_2,\mu_3,\mu_4),\mu_1;1)\\
&=&2^{(n+m+k-1)n}n\left(2^{-(n-1+k)}+2^{-(n-1+m)}+2^{-(2n-2+m+k)}\right)+O\left(2^{(n+m+k-1)n}n^2 2^{-2n-\min(m,k)}\right)\nonumber.
\end{eqnarray}
From equations ($\ref{eq:Fin1}$) and ($\ref{eq:Fin2}$) we have
\begin{eqnarray}\label{eq:Fin23}
C((1^n);1)&=&2^{(n+m+k-1)n}\left(1-n\left(2^{-m-n+1}+2^{-k-n+1}\right)\right)\\
& &+O\left(2^{(n+m+k-1)n}n^2\left(2^{-2n-\min(m,k)}\right)\right)\nonumber.
\end{eqnarray}
The proof of Theorem \ref{theoAXX} then follows directly from ($\ref{eq:Th531}$) and ($\ref{eq:Fin23}$).
\end{proof}
\section{Algorithms and numerical results}\label{numerics}
\subsection{An algorithm to obtain $N(m,n_1,n_2,k)$}
We used the tools developed in the previous sections to compute $N(m,n_1,n_2,k)$ exactly. We implemented the exact formula in Python 3.2 and obtained the values of $N(m,n_1,n_2,k)$ for up to twenty neurons in total (see Table \ref{tab:Networks}). The value of $N(5,5,5,5)$ is larger than $10^{50}$ and was obtained in roughly 1 hour on a personal computer. As follows from Theorems \ref{theoAX} and \ref{theoAXX}, the ratio $N^*(m,n_1,n_2,k)/N(m,n_1,n_2,k)$ tends to 1 as the total number of nodes tends to infinity. A potential way to improve the running time of our algorithm would be to generate tables for the recursively defined functions $C$ and $T$, avoiding repeated computation of the same values, but we did not observe any practical gain.
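To illustrate the table-generation idea just mentioned, the sketch below memoizes a recursively defined function with Python's \texttt{functools.lru\_cache}. The recursion shown is a hypothetical placeholder, not the actual definition of $C$ or $T$ from the previous sections; only the caching pattern is the point.

```python
from functools import lru_cache

# Memoization sketch for tabulating a recursively defined function.
# The recursion below is a hypothetical stand-in for the actual
# definitions of C and T; with the cache in place, each value is
# computed exactly once and then read back from the table.
@lru_cache(maxsize=None)
def C(n):
    if n == 0:
        return 1
    # every C(h) below is looked up in the cache after its first call
    return sum(C(h) * 2 ** (n - h) for h in range(n))
```

The same decorator can be applied unchanged to any pure recursive function, which is exactly the kind of tabulation the text describes.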
{
\centering
\begin{table}[!htbp]
\begin{center}
%\renewcommand{\arraystretch}{2}
\begin{tabular}{ c | l }
$(m,n_1,n_2,k)$ & $N(m,n_1,n_2,k)$, $\; (N/N^*)$ \\
\hline
$(1,2,0,1)$ & 10, \;(0.27778)\tabularnewline[10pt]
$(1,1,1,1)$ & 18, \;(0.28125)\tabularnewline[10pt]
$(2,1,1,1)$ & 102, \;(0.39844)\tabularnewline[10pt]
$(1,2,2,1)$ & 142982, \;(0.53653)\tabularnewline[10pt]
$(2,2,2,1)$ & 2694498, \;(0.63728)\tabularnewline[10pt]
$(2,2,2,2)$ & 51866550, \;(0.76982)\tabularnewline[10pt]
$(3,2,2,2)$ & 884419998, \;(0.82206)\tabularnewline[10pt]
$(2,3,2,2)$ & 78187273596, \;(0.85166)\tabularnewline[10pt]
$(2,3,3,2)$ & 455361619832116,\; (0.90933) \tabularnewline[10pt]
$(3,3,3,2)$ & 29832388255291748,\; (0.93118) \tabularnewline[10pt]
$(3,3,3,3)$ & 1955846953939134676,\; (0.95407) \tabularnewline[10pt]
$(4,3,3,3)$ & 126638097270451278260,\; (0.96531)\tabularnewline[10pt]
$(3,4,3,3)$ & 13070591789841196037595,\; (0.97299)\tabularnewline[10pt]
$(3,4,4,3)$ & 34667056916819114999482177172,\; (0.98449)\tabularnewline[10pt]
$(4,4,4,3)$ & 8909372636328795000677433447860,\; (0.98834)\tabularnewline[10pt]
$(4,4,4,4)$ & 2289737167377258878090980801663700,\; (0.9922)\tabularnewline[10pt]
$(5,4,4,4)$ & 5873165118545144771100744785795825940,\; (0.99415)\tabularnewline[10pt]
$(4,5,4,4)$ & 7709370994349444222506223806797006161890,\; (0.99561)\tabularnewline[10pt]
$(4,5,5,4)$ & 103675627514386351527569061315028472168329351376,\; (0.99756)\tabularnewline[10pt]
$(5,5,5,4)$ & 106228616378925332256701907433326875374372326146256,\; (0.99817)\tabularnewline[10pt]
$(5,5,5,5)$ & 108844524790336539487420588884391944954279893619583184,\; (0.99878)
\end{tabular}
\end{center}
$m:=$ number of input nodes\\
$n_1:=$ number of intermediate nodes of type I\\
$n_2:=$ number of intermediate nodes of type II\\
$k:=$ number of output nodes
\caption{Total number of connected networks}
\label{tab:Networks}
\end{table}
}
\subsection{An algorithm to generate the networks}
While counting the networks satisfying a given set of conditions is interesting in its own right, we also provide an algorithm to generate the adjacency matrices associated with the networks counted by $N(m,n_1,n_2,k)$. As discussed below, generating these networks will, in future work, allow us to perform several tests on the relationship between their structure and their dynamical behaviour.
With each network with $N$ nodes, we associate the adjacency matrix $A=(A_{i,j})_{i,j=1}^N$, which is such that $A_{i,j}=1$ if there is a connection from node $i$ to node $j$ and $A_{i,j}=0$ otherwise. In our case, $N=m+n+k$ and we adopt the following convention regarding the labelling of the nodes: $i=1,\ldots,m$ for the input nodes, $i=m+1,\ldots,m+n$ for the intermediate nodes, and $i=m+n+1,\ldots, m+n+k$ for the output nodes. We define ${\cal E}$ as the following set
$$
{\cal E}=\{ (i,j) : \mbox{ the connection } i\rightarrow j \mbox{ is permitted } \}.
$$
For every pair $(i,j)\in{\cal E}$, we define the rank of $(i,j)$ as
$$
\rnk(i,j)=\#\{(i_1,j_1)\in {\cal E} : (i_1,j_1) \mbox{ precedes } (i,j) \mbox{ in lexicographic order}\},
$$
so that each adjacency matrix $A$ supported on ${\cal E}$ corresponds to the index $\sum_{(i,j)\in{\cal E}}A_{i,j}2^{\rnk(i,j)}$. To verify the connectivity conditions, we use the fact that the entry $(s,t)$ of the matrix $\sum_{h=0}^{n+1}A^h$ is $>0$ iff there is a path from the $s$-th node to the $t$-th neuron.
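The correspondence between indices and adjacency matrices can be sketched in Python. The permitted set ${\cal E}$ assumed below (inputs to intermediates, intermediates to intermediates without self-loops, intermediates to outputs) is our reading of the setup; it yields $|{\cal E}|=n(n+m+k-1)$, matching the count $2^{n(n+m+k-1)}$ used in Algorithm~1.

```python
def permitted_pairs(m, n, k):
    """Permitted connections (i, j), 0-based, in lexicographic order.
    Nodes: inputs 0..m-1, intermediates m..m+n-1, outputs m+n..m+n+k-1.
    Assumed rule: no self-loops, no direct input->output links,
    inputs only send, outputs only receive."""
    N = m + n + k
    return [(i, j) for i in range(m + n) for j in range(m, N)
            if i != j and not (i < m and j >= m + n)]

def matrix_from_index(j, m, n, k):
    """Adjacency matrix whose permitted entries are the binary digits
    of j, the digit of weight 2**r belonging to the pair of rank r."""
    E = permitted_pairs(m, n, k)
    N = m + n + k
    A = [[0] * N for _ in range(N)]
    for r, (s, t) in enumerate(E):
        A[s][t] = (j >> r) & 1
    return A

def index_from_matrix(A, m, n, k):
    """Inverse of matrix_from_index."""
    return sum(A[s][t] << r
               for r, (s, t) in enumerate(permitted_pairs(m, n, k)))
```

The two functions are mutually inverse by construction, so enumerating indices $0,\ldots,2^{|{\cal E}|}-1$ enumerates all candidate networks exactly once.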
Lastly, to avoid counting isomorphic copies of the same network, we make a list of all permutation matrices different from the identity that permute intermediate nodes of type I among themselves and intermediate nodes of type II among themselves. For each such permutation matrix $P$, the matrix $P^TAP$ is the adjacency matrix of a network isomorphic to the one corresponding to $A$. We again implemented this in Python, as described in Algorithm~1; on a personal computer we could generate billions of networks in roughly 1 hour.
While many efficient algorithms that check network connectivity have been developed, such as those based on breadth-first search or depth-first search \cite{Jungnickel}, we preferred the procedure explained above because it is the simplest, it avoids converting adjacency matrices to lists of edges, and it does not have a significant impact on the running time of the whole algorithm. The most costly part of the algorithm is obviously the last one, which generates permutations of an adjacency matrix. Consequently, any significant algorithmic improvement would have to involve a conceptually new way to ensure that isomorphic copies of admissible networks are counted only once.
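For comparison, the breadth-first-search alternative mentioned above takes only a few lines. This is a generic reachability sketch, not the procedure used in our implementation:

```python
from collections import deque

def reachable(adj, s):
    """Set of nodes reachable from s in the digraph whose 0/1
    adjacency matrix is given as a list of rows (breadth-first search)."""
    seen = {s}
    queue = deque([s])
    while queue:
        u = queue.popleft()
        for v, a in enumerate(adj[u]):
            if a and v not in seen:
                seen.add(v)
                queue.append(v)
    return seen
```

Checking the admissibility of a network then amounts to testing, for each intermediate node, membership in the sets reachable from the inputs and reachability of the outputs.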
\begin{algorithm}
\caption{Generating networks satisfying the connectivity conditions}
$ {}$\\
\noindent
{\bf INPUT:} Integers $m,n_1,n_2,k$.
\noindent
{\bf OUTPUT:} A Boolean vector $B$ of length $2^{n(n+m+k-1)}$, where $n=n_1+n_2$, such that $B_j= \truu$ iff the following two conditions are satisfied: 1) $j$ is the index of an admissible network; 2) if $j_1$ is the index of an isomorphic network, then $j\le j_1$.
\noindent\rule{\textwidth}{0.2pt}
\noindent
{\bf INITIALIZATION}
\noindent
Set $M=2^{n(n+m+k-1)}$.
\noindent
Initialize $B$, a Boolean vector of length $M$ with all its values equal to None.
\noindent
Let $\perm$ be the list of all permutation matrices different from the identity that permute intermediate nodes of type I among themselves and intermediate nodes of type II among themselves.
\noindent\rule{\textwidth}{0.2pt}
\noindent
{\bf IDENTIFY ADMISSIBLE MATRICES}
\noindent
for $j$ in range $(0,M)$:
\quad if $B_j$ is equal to None:
\quad \quad compute the adjacency matrix $A$ corresponding to the index $j$
\quad \quad compute the connectivity matrix $ \connect =\sum_{h=0}^{n+1}A^h$
\quad\quad set $\chek =1$
\quad\quad for $s$ in range $(0,n)$:
\quad\quad\quad update $\chek$ with
\quad\quad\quad$ \chek = \chek\left(\sum_{u=1}^m \connect_{u,m+s}\right)\left(\sum_{u=1}^k \connect_{m+s,m+n+u}\right)$
\quad\quad if $\chek >0:$
\quad \quad \quad $B_j=$ True
\quad \quad else:
\quad \quad \quad $B_j=$ False
\quad\quad for $P$ in $\perm$:
\quad\quad\quad compute $\ell$, the index associated with the matrix $P^TAP$
\quad\quad\quad set $B_\ell=$ False
\end{algorithm}
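For concreteness, here is a direct Python transcription of Algorithm~1, feasible only for very small sizes since it loops over all $2^{n(n+m+k-1)}$ indices. The index encoding and the exact set of permitted connections are our assumptions about details left implicit in the pseudocode.

```python
from itertools import permutations

def generate_networks(m, n1, n2, k):
    """Sketch of Algorithm 1.  Nodes are 0-based: inputs 0..m-1,
    intermediates m..m+n-1, outputs m+n..m+n+k-1.  Returns the Boolean
    list B; sum(B) then counts admissible non-isomorphic networks."""
    n = n1 + n2
    N = m + n + k
    # assumed permitted connections, in lexicographic (rank) order
    E = [(i, j) for i in range(m + n) for j in range(m, N)
         if i != j and not (i < m and j >= m + n)]
    M = 1 << len(E)
    # non-identity relabellings permuting type I intermediates among
    # themselves and type II intermediates among themselves
    perms = []
    for p1 in permutations(range(m, m + n1)):
        for p2 in permutations(range(m + n1, m + n)):
            p = list(range(N))
            p[m:m + n1] = p1
            p[m + n1:m + n] = p2
            if p != list(range(N)):
                perms.append(p)
    B = [None] * M
    for j in range(M):
        if B[j] is not None:
            continue
        # adjacency matrix corresponding to the index j
        A = [[0] * N for _ in range(N)]
        for r, (s, t) in enumerate(E):
            A[s][t] = (j >> r) & 1
        # connec = sum of A^h for h = 0..n+1
        connec = [[int(u == v) for v in range(N)] for u in range(N)]
        power = [row[:] for row in connec]
        for _ in range(n + 1):
            power = [[sum(power[u][w] * A[w][v] for w in range(N))
                      for v in range(N)] for u in range(N)]
            connec = [[connec[u][v] + power[u][v] for v in range(N)]
                      for u in range(N)]
        # every intermediate must be reached from an input and reach an output
        check = 1
        for s in range(n):
            check *= (sum(connec[u][m + s] for u in range(m))
                      * sum(connec[m + s][m + n + u] for u in range(k)))
        B[j] = check > 0
        # mark isomorphic copies with larger index as rejected
        for p in perms:
            PA = [[A[p[u]][p[v]] for v in range(N)] for u in range(N)]
            l = sum(PA[s][t] << r for r, (s, t) in enumerate(E))
            if l > j:
                B[l] = False
    return B
```

Under these assumptions the transcription reproduces the smallest entries of Table \ref{tab:Networks}: summing the \texttt{True} entries of $B$ gives 18 for $(m,n_1,n_2,k)=(1,1,1,1)$ and 10 for $(1,2,0,1)$.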
\section{Conclusion} Beyond the stand-alone interest of enumerating digraphs with a given set of sources and sinks, the present paper is relevant to structure enumeration in scientific fields such as neuroscience and the study of finite automata. In neuroscience, the encoding capacity of a network depends on the number of possible combinations of network structures, transfer functions, and synaptic weights \cite{Hopfield, Dubreil2014, Zunic2004}. Hence, network counting can give hints about the computational power of neural systems. Techniques of network enumeration can also be used to investigate the frequency of motif occurrences in neural networks \cite{Itzhack2007}, which can help to determine the local impact of global rules \cite{Vasiliki2014}. The number of functions that can be computed by a class of automata is also directly related to the size of this class. Since automata not satisfying the conditions given in the present paper would contain cells whose behaviour is irrelevant to the global behaviour of the automaton, we hope that the enumeration techniques developed here will eventually lead to a better understanding of the descriptive power of given classes of finite automata.
A first implication of the results of this paper is that, given the rate of increase of $N$ as a function of the number of nodes, the naive idea of unambiguously inferring the structure of a hidden network from input-output relationships is unrealistic for physiological data. Our results can also be used to derive bounds on the quantity of information needed to infer the network structure. Namely, the number of bits necessary to infer the network structure is at least
$$
\frac{\log(N(m,n_1,n_2,k))}{\log(2)}.
$$
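Evaluated on an entry of Table \ref{tab:Networks}, the bound is easy to compute; for instance, with the table value of $N(3,3,3,3)$:

```python
import math

# Lower bound on the number of bits needed to identify a hidden
# network: log(N)/log(2), evaluated on N(3,3,3,3) from Table 1.
N_3333 = 1955846953939134676
bits = math.log(N_3333) / math.log(2)
print(math.ceil(bits))  # at least 61 bits are needed
```

so even this twelve-node configuration already requires more than 60 bits of information.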
Finally, we not only counted the admissible networks but also provided an efficient algorithm to generate them. In future work, this could be used to perform extensive searches over all admissible networks or to sample networks at random from the set of all possibilities.
\begin{thebibliography}{99}
\bibitem{Adelio2002}
A. R. Matamala, P\'{o}lya's combinatorial method and the isomer
enumeration problem, {\it Bol. Soc. Chil. Quim.} \textbf{47} (2002),
105--112.
\bibitem{Ahlswede2000}
R. Ahlswede, N. Cai, S. Y. R. Li, and R. W. Yeung, Network information
flow, {\it IEEE Trans. Info. Theory }{\bf 46} (2000),
1204--1216.
\bibitem{Alon2008} N. Alon, P. Dao, I. Hajirasouliha, F. Hormozdiari,
and S. C. Sahinalp, Biomolecular network motif counting and discovery
by color coding, {\it Bioinformatics} {\bf 24} (2008), 241--249.
\bibitem{Andrews1988}
G. E. Andrews and F. G. Garvan, Dyson's crank of a partition, {\it
Bull. Amer. Math. Soc.} {\bf 18} (1988), 167--171.
\bibitem{Bender1983} E. A. Bender and E. R. Canfield, Enumeration of
connected invariant graphs, {\it J. Comb. Theory B} {\bf 34} (1983),
268--278.
\bibitem{Birsdorff2008} R. Bisdorff and J. L. Marichal, Counting
non-isomorphic maximal independent sets of the \textit{n}-cycle graph,
{\it J. Integer Sequences} {\bf 11} (2008),
\href{https://cs.uwaterloo.ca/journals/JIS/VOL11/Marichal/marichal.html}{Article 08.5.7}.
\bibitem{Craven1980}
T. C. Craven, An application of P\'{o}lya's theory of counting to an
enumeration problem arising in quadratic form theory, {\it J. Comb.
Theory A} {\bf 29} (1980), 174--181.
\bibitem{Dubreil2014}
A. M. Dubreuil, Y. Amit, and N. Brunel, Memory capacity of networks
with stochastic binary synapse, \textit{PLoS Comput. Biol.} \textbf{10}
(2014), 1--15.
\bibitem{Erdos 1959}
P. Erd\H{o}s and A. R\'enyi, On random graphs I, \textit{Publ. Math.
Debrecen} (1959), 290--297.
\bibitem{Fuks} H. Fuks and K. Sullivan, Enumeration of number-conserving
cellular automata rules with two inputs, {\it J. of Cell. Autom.} {\bf
2} (2007), 141--148.
\bibitem{Grassberger} P. Grassberger, Some more exact enumeration
results for 1D cellular automata, {\it J. Phys. A-Math. Theor.} {\bf
20} (1987), 4029--404.
\bibitem{Harary1955} F. Harary, The number of linear, directed, rooted,
and connected graphs, {\it Trans. Amer. Math. Soc.} {\bf 78} (1955),
445--463.
\bibitem{Harary1957} F. Harary, The number of oriented graphs, {\it
Michigan Math. J.} {\bf 4} (1957), 221--224.
\bibitem{Harary1973} F. Harary and E. M. Palmer, {\it Graphical Enumeration},
Academic Press, 1973.
\bibitem{Hofstad2006} R. van der Hofstad and J. Spencer, Counting
connected graphs asymptotically. {\it Eur. J. Combin.}
{\bf 27} (2006), 1294--1320.
\bibitem{Hopfield}
J. J. Hopfield, Neural networks and physical systems with emergent
collective computational abilities, {\it Proc. Nat. Acad. Sci. USA}
\textbf{79} (1982), 2554--2558.
\bibitem{Itzhack2007}
R. Itzhack, Y. Mogilevski, and Y. Louzoun, An optimal algorithm for
counting network motifs, {\it Physica A: Stat. Mech. Appl.}
{\bf 381} (2007), 482--490.
\bibitem{Jungnickel}
D. Jungnickel, {\it Graphs, Networks, and Algorithms}, Springer, 2008.
\bibitem{Knopfmacher2002} J. Knopfmacher, Asymptotic isomer
enumeration, {\it J. Math. Chem.} {\bf 24} (1998), 61--69.
\bibitem{Lavertu2014} G. Lavertu, S. L. C\^ote, and Y. De
Koninck, Normal and abnormal coding of somatosensory stimuli
causing pain, {\it Brain} {\bf 137} (2014), 724--738.
\bibitem{Pittel2005} B. Pittel and N. C. Wormald, Counting connected
graphs inside-out, {\it J. Comb. Theory B} {\bf 93} (2005), 127--172.
\bibitem{Polya1937} G. P\'{o}lya, Combinatorial enumeration of groups,
graphs, and chemical compounds, {\it Acta Math.} {\bf 68} (1937),
145--254.
\bibitem{Semenovich} M. Semenovich, B. Gordana, and D. Crnkovic,
{\it Information and Computation: Essays on Scientific and Philosophical
Understanding of Foundations of Information and Computation}, World
Scientific Publishing, 2011.
\bibitem{Twining} C. J. Twining and P.-M. Binder, Enumeration of limit
cycles in noncylindrical cellular automata, {\it J. Stat. Phys.} {\bf
66} (1992), 385--401.
\bibitem{Vasiliki2014}
E. Vasilaki and M. Giugliano, Emergence of connectivity motifs in
networks of model neurons with short- and long-term plastic synapses,
{\it PLoS ONE} {\bf 9} (2014), 1--17.
\bibitem{Yan2012}
X. Yan, R. W. Yeung, and Z. Zhang, An implicit characterization of the
achievable rate region for acyclic multisource multisink network
coding, {\it IEEE Trans. Info. Theory} {\bf 58} (2012),
5625--5639.
\bibitem{Zunic2004}
J. Zunic, On encoding and enumerating threshold functions, {\it IEEE
Trans. Neural Networks} {\bf 15} (2004), 261--267.
\end{thebibliography}
\bigskip
\hrule
\bigskip
\noindent 2010 {\it Mathematics Subject Classification}:
Primary 05A15; Secondary 05A16, 20D60, 92B20.
\noindent \emph{Keywords: }
graphical enumeration, P\'{o}lya's enumeration theorem,
digraph enumeration, hidden neural networks, finite automata enumeration.
\bigskip
\hrule
\bigskip
\vspace*{+.1in}
\noindent
Received December 8 2015;
revised versions received March 3 2016; April 8 2016.
Published in {\it Journal of Integer Sequences}, May 10 2016.
\bigskip
\hrule
\bigskip
\noindent
Return to
\htmladdnormallink{Journal of Integer Sequences home page}{http://www.cs.uwaterloo.ca/journals/JIS/}.
\vskip .1in
\end{document}