A larger C slows the grade, where C is empirically determined to avoid early convergence.

3.1.1. Model Parameters
The hypernetwork has a data-driven structure. The fixed dimensionality of the input data determines the overall structure of the network. Within the network, several edge configurations can be applied; an edge configuration depends on both the order size k of the edges and the combination of edge types. Outside the network, the learning procedure, such as the number of repeated encodings, can be modulated. The hyperedge configuration is essential for modulating the structure. Generally, a k-hypergraph is composed of k-uniform hyperedges, where each hyperedge has length k. If a single k is used, we find the optimal configuration by tuning its magnitude. If k is variable, the hypernetwork is built with a mixture of different order sizes. Another parameter of the hyperedges is the combinational type used to compose a hyperedge. One hyperedge includes serially adjacent nodes in the data. However, the serial order of the data does not guarantee a close relationship among the data attributes. Furthermore, when knowledge of the causal relations among the attributes is absent, the serial order can influence the encoded model inadequately. Hence, the way edges are combined from the attributes is important for building a memory model with high-order relationships. The last parameter that affects the structure of a hypernetwork is the repetition of data encoding
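The two combination types described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: `make_hyperedges` and its parameters are hypothetical names, and the "random" mode stands in for any combination scheme that does not assume serially adjacent attributes are closely related.

```python
import random

def make_hyperedges(instance, k, num_edges, combination="serial"):
    """Sample k-uniform hyperedges from one data instance.

    Each hyperedge is a tuple of (attribute index, value) pairs.
    combination="serial" takes k serially adjacent attributes;
    combination="random" samples k attributes regardless of order.
    """
    d = len(instance)
    edges = []
    for _ in range(num_edges):
        if combination == "serial":
            # window of k consecutive attribute positions
            start = random.randrange(d - k + 1)
            idx = range(start, start + k)
        else:
            # arbitrary subset of k distinct attribute positions
            idx = sorted(random.sample(range(d), k))
        edges.append(tuple((i, instance[i]) for i in idx))
    return edges

# Example: one 6-attribute instance, order size k = 3
instance = ["a", "b", "c", "d", "e", "f"]
serial_edges = make_hyperedges(instance, k=3, num_edges=4, combination="serial")
random_edges = make_hyperedges(instance, k=3, num_edges=4, combination="random")
```

With a variable-order hypernetwork, k itself could be drawn per edge instead of being passed in as a constant.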
into the memory. After an instance is encoded once, what happens if the instance is encoded again? Repeated encodings are interpreted as the study duration in recognition memory [39–41]. For a single instance, multiple encodings can affect both the performance and the structure of the memory model. Depending on the study duration, a hypernetwork can become a dense or a coarse network. Consequently, the parameters that influence the memory structure are the relations between attributes, the edge order size, the combinational order of the edges, and the repetition of encoding and retrieval.

3.1.2. Scalability

The proposed hypernetwork stacks input data into memory as the data accumulate. Over a lifelong experience, the length of the incoming data stream is temporally unlimited. Thus, our concern regarding the memory model is the capacity of the patterns it can cover. The main characteristic of the memory structure is that it reflects the partitioning and combining of the data. When we define the number of values of each contextual attribute as $C_i$, where $i$ ranges from 1 to $d$ (the dimension of the attributes), the number of possible instances is $\prod_{i=1}^{d} C_i$. If we set a fixed order size $k$, the number of possible edges is represented as follows:

$$\prod_{i=1}^{k} C_i + \prod_{i=2}^{k+1} C_i + \cdots + \prod_{i=d}^{k+d-1} C_i = \sum_{t=1}^{d} \left( \prod_{i=t}^{k+t-1} C_i \right).$$
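The instance and edge counts above can be evaluated numerically as a sanity check. This is an illustrative sketch with hypothetical function names; in particular, it assumes that windows whose indices run past $d$ wrap around circularly, since the formula's last term ranges from $i = d$ to $k + d - 1$.

```python
import math

def num_instances(C):
    """Total number of distinct instances: prod_{i=1..d} C_i."""
    return math.prod(C)

def num_fixed_order_edges(C, k):
    """Number of possible k-uniform edges: sum_{t=1..d} prod_{i=t..t+k-1} C_i.

    Assumption: windows running past position d wrap around circularly.
    """
    d = len(C)
    return sum(math.prod(C[(t + j) % d] for j in range(k)) for t in range(d))

# Example: d = 4 attributes with value counts C = [2, 3, 4, 5]
C = [2, 3, 4, 5]
print(num_instances(C))             # 2*3*4*5 = 120
print(num_fixed_order_edges(C, 2))  # 2*3 + 3*4 + 4*5 + 5*2 = 48
```

The edge count grows with both $k$ and $d$, which is why the capacity of the accumulated memory, rather than raw storage, is the scalability concern here.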