#################### dataset after `dataset_map_fn` #################### [{'input': '[TITLE]\nAction Inference by Maximising Evidence: Zero-Shot Imitation from Observation with World Models\n\n[ABSTRACT]\nUnlike most reinforcement learning agents which require an unrealistic amount of environment interactions to learn a new behaviour, humans excel at learning quickly by merely observing and imitating others. This ability highly depends on the fact that humans have a model of their own embodiment that allows them to infer the most likely actions that led to the observed behaviour. In this paper, we propose Action Inference by Maximising Evidence (AIME) to replicate this behaviour using world models. AIME consists of two distinct phases. In the first phase, the agent learns a world model from its past experience to understand its own body by maximising the evidence lower bound (ELBO). While in the second phase, the agent is given some observation-only demonstrations of an expert performing a novel task and tries to imitate the expert\'s behaviour. AIME achieves this by defining a policy as an inference model and maximising the evidence of the demonstration under the policy and world model. Our method is "zero-shot" in the sense that it does not require further interactions with the environment after given the demonstration. We empirically validate the zero-shot imitation performance of our method on the Walker of the DeepMind Control Suite and find it outperforms the state-of-the-art baselines. We also find AIME with image observations still matches the performance of the baseline observing the true low-dimensional state of the environment.\n\n[CAPTIONS]\nFigure 1: Overview of AIME algorithm. In phase 1, both observations and actions are provided by the embodiment dataset and the agent learns a variational world model to model the evidence of observations conditioned on the actions. Then the learnt model weights are frozen and transferred to phase 2. In phase 2, only the observations are provided by the demonstration dataset, so the agent needs to infer both states and actions. The action inference is achieved by the policy model which samples actions given a state. The grey lines indicate the world model parameters are frozen in phase 2. Both phases are optimised toward the same objective, i.e. the ELBO.\nFigure 2: Performances on Walker. Each column indicates one task and its associated demonstration dataset, while each row indicates the embodiment datasets used to train the model. The title of each figure is named according to D body -> D demo . Numbers are computed by averaging among 10 trials and then normalized to the percentage of the expert\'s performance. The last row and column are averaged over the corresponding task or dataset.\nFigure 3: Ablation of number of demonstrations on transferring from p2e dataset. The performance is shown as the percentage of expert performance over 3 seeds.\n\n[CONTENT]\nSection Title: INTRODUCTION\n INTRODUCTION In recent years, deep reinforcement learning (DRL) has enabled intelligent decision-making agents to thrive in multiple fields Mnih et al. (2015) ; Silver et al. (2016) ; Vinyals et al.; Choi et al. (2023) ; OpenAI et al. (2019) ; Ouyang et al. (2022) . However, one of the biggest issues of DRL is sample inefficiency. The dominant framework in DRL is learning from scratch Agarwal et al. (2022) . Thus, most algorithms require an incredible amount of interactions with the environment Mnih et al. (2015) ; Silver et al. (2016) ; Vinyals et al.. In contrast, cortical animals such as humans are able to quickly learn new tasks through just a few trial-and-error attempts, and can further accelerate their learning process by observing others. An important difference between biological learning and the DRL framework is that the former uses past experience for new tasks. When we try a novel task, we use previously learnt components, and generalise to solve the new problem efficiently. This process is augmented by imitation learning Iacoboni (2008) , which allows us to replicate similar behaviours without direct observation of the underlying muscle movements. If the DRL agents could similarly harness observational data, such as the abundant online video data, the sample efficiency may be dramatically improved Baker et al. (2022) . However, directly learning a model from observation-only sequences is insufficient for both biological and technical systems, since we can only observe the outcome of actions, but do not know the actions themselves. Without knowing these actions, the observation sequences are highly stochastic and multi- modal Babaeizadeh et al. (2018) . Trying to infer these unknown actions without prior knowledge of the world is difficult due to the problem of attributing which parts of the observations are influenced by the actions and which parts are governed by normal system evolution or noise. Therefore, in this work, we hypothesise that in order to make best use of observation-only sequences, an agent has to first understand the notion of an action. This can be achieved by learning a model from an agent\'s past experiences where both the actions and their consequences, i.e. observations, are available. Given such a learnt model which includes a causal model of actions and their effects, it becomes feasible for an agent to infer an action sequence leading to given observation-only data. In this work, we propose a novel algorithm, Action Inference by Maximising Evidence (AIME), to try to replicate the imitation ability of humans. The agent first learns a world model from its past experience by maximising the evidence of these experiences. After receiving some observation-only demonstrations of a novel task, the agent tries to mimic the demonstrator by finding an action sequence that makes the demonstration most likely under the learnt model. This procedure is shown in Figure 1 . Our contribution can be summarised as follows: • We propose AIME, a novel method for imitation from observation. AIME first learns a world model by maximising the evidence of its past experience, then consider the policy as an action inference model and imitates by maximising the evidence of demonstration. • We conduct experiments with a variety of datasets and tasks to demonstrate the superior performance of AIME compared with other state-of-the-art methods.\n\nSection Title: PROBLEM FORMULATION\n PROBLEM FORMULATION Consider an MDP problem defined by the tuple {S, A, T, R}, where S is the state space, A is the action space, T : S × A → S is the dynamic function and R : S → R is the reward function. A POMDP adds partial observability upon an MDP with two components: the observation space O and the emission function Ω : S → O. The six components of a POMDP can be categorised into three groups: S, A and T define the embodiment of our agent, O and Ω define the sensors of our agent and R itself defines the task. The goal is to find a policy π : S → A which maximises the accumulated reward, i.e. t r t . In this paper, we want to study imitation learning within a fixed embodiment across different tasks. We presume the existence of two datasets for the same embodiment: • Embodiment dataset D body contains trajectories {o 0 , a 0 , o 1 , a 1 . . . } that represent past experiences of interacting with the environment. This dataset provides information about the embodiment for the algorithm to learn a model. For example, in this paper, the dataset is a replay buffer filled while solving some tasks with the same embodiment. But in general, it may be any collection of past experiences of the embodiment. • Demonstration dataset D demo contains a few expert trajectories {o 0 , o 1 , o 2 . . . } of the embodiment solving a certain task defined by R demo . The crucial difference between this dataset and the embodiment dataset is that the actions are not provided anymore since they are not observable from a third-person perspective. The goal of our agent is to use information in D body to learn a policy π from D demo which can solve the task defined by R demo as well as by the expert who generated D demo . For the sake of simplicity, we assume that the two datasets share the same observation space O and the emission model Ω.\n\nSection Title: METHODOLOGY\n METHODOLOGY In this section, we describe our proposed method, AIME, in detail. AIME consists of two phases. In the first phase, the knowledge of the embodiment is learnt through a form of world model; while in the second phase, this knowledge is used to imitate the expert.\n\nSection Title: PHASE 1: MODEL LEARNING\n PHASE 1: MODEL LEARNING In the first phase, we need to learn a model to understand our embodiment. We choose to achieve this by learning a world model. As an analogy to a language model, we formally define a world model as a probability distribution over sequences of observations. The model can be either unconditioned or conditioned on other factors such as previous observations or actions. For phase 1, the model needs to be the conditional distribution, i.e. p(o 1:T |a 0:T −1 ), to model the effect of the actions. When given a certain observation sequence, the likelihood of this sequence under the model is usually referred to as evidence. In this paper, we consider variational world models where the observation is governed by a Markovian hidden state. In the literature, this type of model is also referred to as a state-space model (SSM). Such a variational world model involves four components, namely f ϕ (o t ) is the encoder to extract the features from the observation; q ϕ (s t |s t−1 , a t−1 , z t ) and p θ (s t |s t−1 , a t−1 ) are the posterior and the prior of the latent state variable; while p θ (o t |s t ) is the decoder that decodes the observation distribution from the state. ϕ and θ represent the parameters of the inference model and the generation model respectively. Typically, a variational world model is trained by maximising the ELBO which is a lower bound of the log-likelihood, or evidence, of the observation sequence, i.e. log p θ (o 1:T |a 0:T −1 ). Given a sequence of observations, actions, and states, the objective function can be computed as The objective function is composed of two terms: the first term J rec is the likelihood of the observation under the inferred state, which is usually called the reconstruction loss; while the second term J KL is the KL divergence between the posterior and the prior distributions of the latent state. To compute the Reincarnating Reinforcement Learning Workshop at ICLR 2023 objective function, we need to sample the inferred states from the observation and action sequence. This is done by autoregressively sampling from the posterior with the re-parameterisation trick Kingma & Welling (2014) ; Rezende et al. (2014) . Combining all these, we formally define the optimisation problem for this phase as In this work, we use a specific variational world model called RSSM Hafner et al. (2019 ; 2020 ), which offers state-of-the-art performances by splitting the latent state to be a combination of deterministic and stochastic components.\n\nSection Title: PHASE 2: IMITATION LEARNING\n PHASE 2: IMITATION LEARNING In the second phase, we want to utilise the knowledge of the world model from the first phase to imitate the expert behaviour from the demonstration dataset D demo in which only sequences of observations but no actions are available. We will derive our algorithm from two different perspectives.\n\nSection Title: The Bayesian derivation\n The Bayesian derivation Since the actions are unknown in the demonstration, instead of modelling the conditional evidence in phase 1, we need to model the unconditional evidence, i.e. log p θ (o 1:T ). Thus, we also need to model the actions as latent variables together with the states. In this way, the reconstruction term J rec will stay the same as eq. (2), while the KL term will be defined on the joint distribution of states and actions, i.e. If we choose the action inference model in the form of a policy, i.e. π ψ (a t |s t ), and share it in both posterior and prior, then the new posterior and prior can be factorised as respectively. When we plug them into the eq. (5), the policy term cancels and we will get a similar optimisation problem with phase 1 as The main difference between eq. (4) and eq. (8) is where the action sequence is coming from. In phase 1, the action sequence is coming from the embodiment dataset, while in phase 2, it is sampled from the policy instead since it is not available in the demonstration dataset. The control derivation From another perspective, we can view phase 2 as a control problem. One crucial observation is that, as shown in eq. (1), given a trained world model, we can evaluate the lower bound of the evidence of any observation sequence given an associated action sequence as the condition. In a deterministic environment where the inverse dynamics model is injective, the true action sequence that leads to the observation sequence is the most likely under the true model. In general, the true action sequence may not necessarily be the most likely under the model. This is, however, a potential benefit of our approach. We are mainly interested in mimicking the expert\'s demonstration and may be better able to do so with a different action sequence. Thus, for each observation sequence that we get from the demonstration dataset, finding the missing action sequence can be considered as a trajectory-tracking problem and can be tackled by planning. To be specific, we can find the missing action sequence by solving the optimisation problem If we solve the above optimisation problem for every sequence in the demonstration dataset, the problem will be converted to a normal imitation learning problem and can be tackled with standard techniques such as behavioural cloning. We can also view this as forming an implicit inverse dynamics model (IDM) by inverting a forward model. Compute objective function J from eq. (1) Update policy parameters ψ ← ψ + α∇ ψ J To make it more efficient, we use amortised inference. We directly define a policy π ψ (a t |s t ) under the latent state of the world model. By composing the learnt world model and the policy, we can form a new generative model of the state sequence by the chain of s t → a t → s t+1 → a t+1 . . . → s T . Then we will get the same optimisation problem as eq. (8). To sum up, in AIME, we use the same objective function - the ELBO - in both phases with the only difference being the source of the action sequence. We provide the pseudo-code for the algorithm in Algorithm 1 with the colour highlighting the different origins of the actions between the two phases.\n\nSection Title: EXPERIMENTS\n EXPERIMENTS To test our method, we need multiple environments sharing an embodiment while posing different tasks. Therefore, we consider the embodiment Walker of the DeepMind Control Suite (DMC Suite) Tunyasuvunakool et al. (2020) , which poses three tasks: stand, walk and run. Following the common practise in the benchmark Hafner et al. (2020) , we repeat every action two times when interacting with the environment. For the Walker environment, the true state includes both the position and the velocity of each joint and the centre of mass of the body. In order to study the influence of different observation modalities, we consider three settings for each environment: MDP uses the true state as the observation; Visual uses images as the observation; LPOMDP uses only the position part of the state as the observation, so that information-wise it is identical to the Visual setting but the information is densely represented in a low-dimensional form. To generate the embodiment and demonstration datasets, we train a Dreamer Hafner et al. (2020) agent in the Visual setting for each of the tasks for 1M environment steps. Our Dreamer implementation can solve stand and walk quite well and efficiently. For the run task, the reward plateaus are around 600 but in the video, we can observe decent running behaviour. The exact performance of the demonstration dataset can be found at Table 1. We take the replay buffer of these trained agents as the embodiment datasets D body , which contain 1000 trajectories, and consider the converged policy as the expert to collect another 1000 trajectories as the demonstration dataset D demo . We only use 100 trajectories for the main experiments, and the left trajectories are used during ablation study. Besides the above embodiment datasets, we also study datasets generated by purely exploratory behaviour. To do this, we consider two settings. First, we use a random policy that samples uniformly from the action space to collect 1000 trajectories, and we call this a random dataset. Second, we train a Plan2Explore Sekar et al. (2020) agent for 1000 trajectories and label its replay buffer as the p2e dataset. Moreover, we also merge all the above datasets except the run dataset to form a mix dataset, Reincarnating Reinforcement Learning Workshop at ICLR 2023 which is close to a practical setting when one has a lot of experience with one embodiment and use also the data they collect to train a model.\n\nSection Title: BASELINES AND IMPLEMENTATION DETAILS\n BASELINES AND IMPLEMENTATION DETAILS We mainly compare our method with BCO Torabi et al. (2018a) , specifically the BCO(0) variant. BCO(0) first trains an IDM from the embodiment dataset and then used the trained IDM to label the demonstration dataset and then uses Behavioural Cloning (BC) to recover the policy. We do not compare with other methods since they either require further environment interactions Torabi et al. (2018b) ; Li et al. (2022) or use a goal-conditional setting Pathak et al. (2018) which does not suit the locomotion tasks. BCO(0) originally applies to the MDP setting. In order to adapt it to the POMDP setting, we use the common stacking trick Mnih et al. (2015) . In particular, we stack 5 consecutive observations to form a state for BCO. For the Visual setting, before stacking, the observation first passes through a shared CNN encoder to extract features. The structure of the CNN is implemented as in Ha & Schmidhuber; Hafner et al. (2019) . Both the IDM and the policy are implemented by an MLP with two hidden layers and 128 units for each layer. The activation functions are chosen to be ELU Clevert et al. (2015) for the hidden layers and Tanh for the output layer. Following the original paper, we split both datasets to 7 : 3 for the training and validation dataset. We train the models with the Adam Kingma & Ba (2014) optimiser for 50 epochs and then use the model with the best validation loss. To ensure that we are not artificially limiting the performance by computational resources, we also tried increasing both the width and depth of the MLP, but did not find any increase in performance. For our method, the RSSM implementation is largely following Dreamer-v1 Hafner et al. (2020) with continuous stochastic and deterministic variables. Although newer versions of Dreamer Hafner et al. (2021 ; 2023) offer some new tricks to improve performance, we choose not to use them for the sake of simplicity. We use a slightly larger state space for our experiment with 512 deterministic and 128 stochastic dimensions and find it generally eases the policy training process to collect the datasets. When decoding the image observation, we follow the same structure as Ha & Schmidhuber; Hafner et al. (2019) , while for the low-dimensional observations, the decoder is implemented as Gaussian distribution with both mean and variance being parameterised. Except for the deterministic part of the state using a GRU cell Chung et al. (2014) , all the other networks are MLPs with 2 hidden layers and 128 units of each layer. We do not use any free nats Hafner et al. (2020) , KL scaling Hafner et al. (2020) and KL balancing Hafner et al. (2021) tricks in the literature to relax the constraint of the KL term. When decoding low-dimensional signals, we sometimes observed the decoder yielding a degenerate solution as found in Seitzer et al. (2022) . We use their β-nll to remedy this problem, and since it re-weights the reconstruction term, we re-weight the KL term accordingly to maintain the balance. For phase 2 of AIME, we train the policy for 500 epochs and 100 gradient steps for each epoch. We report performance of the policy from the last epoch without any early stopping criteria. To maximise efficient usage of computational resources, we directly use the trained world model of the Dreamer agent when conducting our experiments, except for the Random dataset where we train the model from scratch in phase 1. An important finding is that, although loading pre-trained weights generally helps performance, the method does not perform well when we also transfer the weights of the learnt policy. We conjecture that it is due to learnt policies being stuck in some local minima that they are unable to escape.\n\nSection Title: RESULTS\n RESULTS The main result of our comparison is shown in Figure 2 . We can see that AIME largely outperforms BCO(0) in all the environment settings. AIME achieves the lowest performance on the Visual setting, but even that outperforms BCO(0)-MDP which can access the true states. We attribute the good performance to two reasons. First, the forward model has a better data utilisation rate than the inverse model because the forward model is trained to reconstruct whole observation sequences, while the inverse model only takes short clips of the sequence and only predicts the actions. Thus, the forward model has less chance to overfit and provides better generalisation. Second, by maximising the evidence, our method strives to find an action sequence that leads to the same outcome, not to recover the true actions. For many systems, the dynamics are not fully invertible. For example, if a human applies force to the wall, since the wall does not move, one cannot tell how much force is really Reincarnating Reinforcement Learning Workshop at ICLR 2023 applied by visual observation. The same situation applies to the Walker when certain joints are locked due to the singular pose. This same phenomenon is also discussed in Pathak et al. (2018) . How does the choice of dataset influence the results? First, as expected, for all the variants of methods, transferring within the same task is better than transferring between different tasks. In these settings, BCO(0)-MDP is comparable with AIME. However, AIME shines in cross-task transfer. Especially when transferring between run and walk tasks and transferring from stand to run, AIME outperforms the baselines by a large margin, which indicates the strong generalisability of a forward model over an inverse model. We also find that AIME makes substantially better use of exploratory data. AIME largely outperforms baselines when using the p2e dataset as the embodiment dataset and outperforms most variants when using the Random dataset as the embodiment dataset. Moreover, when transferring from the mix dataset, except for the MDP version, AIME outperforms other variants that train the world model on one of the mixed datasets individually. This showcases the scalability of a world model to be trained on a diverse set of experiences, which could be more valuable in real-world scenarios. How is AIME influenced by the observation modality? Compared with BCO(0), AIME is quite robust to the choice of observation modality, but it still makes a little difference. Although the observation provides the same information, we find AIME in the LPOMDP setting performs much better than in the Visual setting in all test cases. We attribute it to the fact that low-dimension signals have denser information and offer a smoother landscape in the evidence space than the pixels so that it can provide a more useful gradient to guide the action inference. Surprisingly, although having access to more information, AIME-MDP performs worse than AIME-LPOMDP on average. The biggest gap happens when transferring from p2e dataset. We conjecture this to the fact the RSSM world model of AIME-MDP is not trained well with the default hyper-parameters, but we defer further investigation to future work. How does the number of demonstrations influence the performance? We conduct the ab- lation study on p2e embodiment dataset where we vary the number of demonstrations within {1, 2, 5, 10, 20, 50, 100, 200, 500, 1000}. The result is shown in Figure 3 . BCO(0) struggles with low-data scenarios and typically needs at least 10 demonstrations to surpass its initial performance. In contrast, AIME demonstrates continual improvement with as few as 2 trajectories. This demonstrates the superior sample efficiency of our method. Moreover, the performance of AIME keeps increasing as more trajectories are provided beyond 100, which showcases the scalability of the method. We also notice that there is a dent in the performance curve at 5 demonstrations and we conjecture it is caused by bad demonstrations.\n\nSection Title: RELATED WORK\n RELATED WORK Imitation learning from observations Previous works on imitation learning from only observation can be roughly categorised into two groups, one based on IDMs Torabi et al. (2018a) ; Baker et al. (2022) ; Pathak et al. (2018) and one based on generative adversarial imitation learning (GAIL) Ho & Ermon (2016) ; Torabi et al. (2018b) ; Li et al. (2022) . The core component of the first group is to learn an IDM that maps a state transition pair to the action that caused the transition. Torabi et al. (2018a) ; Baker et al. (2022) use the IDM to label the expert\'s observation sequences, then solve the imitation learning problem with standard BC. Pathak et al. (2018) extends the IDM to a goal-conditioned setting and communicates the task with keyframes of the goal trajectory. Different from these methods, our approach uses a forward model to capture the knowledge of the embodiment. In the second group of approaches, the core component is a discriminator that distinguishes the demonstrator\'s and the agent\'s observation trajectories. Then the discriminator serves as a reward function, and the agent\'s policy is trained by RL Ho & Ermon (2016) . As a drawback, in order to train this discriminator the agent has to constantly interact with the environment to produce negative samples. Different from these methods, our method does not require further interactions with the environment, enabling zero-shot imitation from the demonstration dataset.\n\nSection Title: Reusing learnt components in decision-making\n Reusing learnt components in decision-making Although transferring pre-trained models has become a dominant approach in natural language processing (NLP) Devlin et al. (2019) ; Radford et al. (2019) ; Bommasani et al. (2021) and has been getting more popular in computer vision (CV) He et al. (2022) ; Bommasani et al. (2021) , reusing learnt components is less studied in the field of decision-making Agarwal et al. (2022) . Most existing works focus on transferring policies Finn et al. (2017) ; Baker et al. (2022) ; Agarwal et al. (2022) . On the other hand, the world model, a type of powerful perception model, that is purely trained by self-supervised learning lies behind the recent progress of model-based reinforcement learning Ha & Schmidhuber; Hafner et al. (2019 ; 2020 ; 2021; 2023); Łukasz Kaiser et al. (2020) ; Schrittwieser et al. (2020) . However, the transferability of these Reincarnating Reinforcement Learning Workshop at ICLR 2023 world models is not well-studied. Sekar et al. (2020) learns a policy by using a pre-trained world model from exploration data and demonstrates superior zero-shot and few-shot performance. We improve upon this direction by studying a different setting, i.e. imitation learning. In particular, we communicate the task to the model by observing the expert while Sekar et al. (2020) communicates the task by a ground truth reward function which is less accessible in a real-world setting.\n\nSection Title: DISCUSSION & CONCLUSION\n DISCUSSION & CONCLUSION In this paper, we present AIME, a model-based method for imitation from observations. The core of the method exploits the power of a pre-trained world model and inverses it w.r.t. action input by taking the gradients. On the Walker embodiment from the DMC Suite, we demonstrate superior performance compared to baselines, even when some baselines can access the true state. Although AIME performs well, there are still many limitations. First, humans mostly observe others with vision. Although AIME works quite well in the Visual setting, there is still a large gap compared with the LPOMDP setting where the low-level signals are observed. We attribute this to the fact that the loss surface of the pixel reconstruction loss may not be smooth enough to allow the gradient method to find an equally good solution. Second, in this paper, we only study the most simple setting where both the embodiment and sensor layout are fixed across tasks. On the other hand, humans observe others in a third-person perspective and can also imitate animals whose body is not even similar to humans\'. Relaxing these assumptions will open up possibilities to transfer across different embodiments and even directly from human videos. Third, for some tasks, even humans cannot achieve zero-shot imitation by only watching others. This may due to the task\'s complexity or completely unfamiliar skills. So, even with proper instruction, humans still need to practise in the environment to solve some tasks. This motivates an online learning phase 3 as an extension to our framework. We defer these topics to future work.\n', 'output': '[Significance and novelty]\n The AIME framework presents an imitation learning method that learns a world model from past experience, and then uses that model to infer actions from an observation-only demonstration, showcasing a unique approach to reinforcement learning.\n The paper discusses interesting similarities between AIME and transfer learning techniques in sequence modeling research, which may present a novel perspective for reinforcement learning.\n\n[Potential reasons for acceptance]\n The paper meticulously documents the choice of baselines, experimental setup, and testing on multiple representative environments, providing strong evidence of the generalizability and efficacy of the AIME framework.\n The paper is presented in a simple and understandable manner, making it accessible to a wide audience.\n\n[Potential reasons for rejection]\n The benefits of AIME over existing methods such as BCO and model-based reinforcement learning approaches are not explicitly laid out, and a clearer differentiation may be necessary for acceptance.\n The evaluation is noted to be on relatively simple tasks, and there is a suggestion to compare AIME to other algorithms in more complex settings.\n\n[Suggestions for improvement]\n The authors could take time to explicitly lay out the differences between AIME and existing methods like BCO and model-based reinforcement learning approaches.\n It would be beneficial to compare AIME to other algorithms in more complex settings to showcase its performance in extremely difficult learning environments.\n\n', 'system': "You are a professional machine learning conference reviewer who reviews a given paper and considers 4 criteria: [Significance and novelty], [Potential reasons for acceptance], [Potential reasons for rejection], and [Suggestions for improvement]. For each criterion, provide random number of supporting points derived from the paper's content. And for each supporting point, use the format: '' followed by a detailed explanation. Your response should only include your reviews only, which means always start with [Significance and novelty], dont' repeat the given paper and output things other than your reviews in required format. The paper is given as follows:"}] Map: 0%| | 0/10 [00:00<?, ? examples/s] Map: 100%|██████████| 10/10 [00:00<00:00, 707.96 examples/s] #################### dataset after adding templates #################### [{'input': '[INST] You are a professional machine learning conference reviewer who reviews a given paper and considers 4 criteria: [Significance and novelty], [Potential reasons for acceptance], [Potential reasons for rejection], and [Suggestions for improvement]. For each criterion, provide random number of supporting points derived from the paper\'s content. And for each supporting point, use the format: \'<title of supporting point>\' followed by a detailed explanation. Your response should only include your reviews only, which means always start with [Significance and novelty], dont\' repeat the given paper and output things other than your reviews in required format. The paper is given as follows: [/INST]\n[INST] [TITLE]\nAction Inference by Maximising Evidence: Zero-Shot Imitation from Observation with World Models\n\n[ABSTRACT]\nUnlike most reinforcement learning agents which require an unrealistic amount of environment interactions to learn a new behaviour, humans excel at learning quickly by merely observing and imitating others. This ability highly depends on the fact that humans have a model of their own embodiment that allows them to infer the most likely actions that led to the observed behaviour. In this paper, we propose Action Inference by Maximising Evidence (AIME) to replicate this behaviour using world models. AIME consists of two distinct phases. In the first phase, the agent learns a world model from its past experience to understand its own body by maximising the evidence lower bound (ELBO). While in the second phase, the agent is given some observation-only demonstrations of an expert performing a novel task and tries to imitate the expert\'s behaviour. AIME achieves this by defining a policy as an inference model and maximising the evidence of the demonstration under the policy and world model. Our method is "zero-shot" in the sense that it does not require further interactions with the environment after given the demonstration. We empirically validate the zero-shot imitation performance of our method on the Walker of the DeepMind Control Suite and find it outperforms the state-of-the-art baselines. We also find AIME with image observations still matches the performance of the baseline observing the true low-dimensional state of the environment.\n\n[CAPTIONS]\nFigure 1: Overview of AIME algorithm. In phase 1, both observations and actions are provided by the embodiment dataset and the agent learns a variational world model to model the evidence of observations conditioned on the actions. Then the learnt model weights are frozen and transferred to phase 2. In phase 2, only the observations are provided by the demonstration dataset, so the agent needs to infer both states and actions. The action inference is achieved by the policy model which samples actions given a state. The grey lines indicate the world model parameters are frozen in phase 2. Both phases are optimised toward the same objective, i.e. the ELBO.\nFigure 2: Performances on Walker. Each column indicates one task and its associated demonstration dataset, while each row indicates the embodiment datasets used to train the model. The title of each figure is named according to D body -> D demo . Numbers are computed by averaging among 10 trials and then normalized to the percentage of the expert\'s performance. The last row and column are averaged over the corresponding task or dataset.\nFigure 3: Ablation of number of demonstrations on transferring from p2e dataset. The performance is shown as the percentage of expert performance over 3 seeds.\n\n[CONTENT]\nSection Title: INTRODUCTION\n INTRODUCTION In recent years, deep reinforcement learning (DRL) has enabled intelligent decision-making agents to thrive in multiple fields Mnih et al. (2015) ; Silver et al. (2016) ; Vinyals et al.; Choi et al. (2023) ; OpenAI et al. (2019) ; Ouyang et al. (2022) . However, one of the biggest issues of DRL is sample inefficiency. The dominant framework in DRL is learning from scratch Agarwal et al. (2022) . Thus, most algorithms require an incredible amount of interactions with the environment Mnih et al. (2015) ; Silver et al. (2016) ; Vinyals et al.. In contrast, cortical animals such as humans are able to quickly learn new tasks through just a few trial-and-error attempts, and can further accelerate their learning process by observing others. An important difference between biological learning and the DRL framework is that the former uses past experience for new tasks. When we try a novel task, we use previously learnt components, and generalise to solve the new problem efficiently. This process is augmented by imitation learning Iacoboni (2008) , which allows us to replicate similar behaviours without direct observation of the underlying muscle movements. If the DRL agents could similarly harness observational data, such as the abundant online video data, the sample efficiency may be dramatically improved Baker et al. (2022) . However, directly learning a model from observation-only sequences is insufficient for both biological and technical systems, since we can only observe the outcome of actions, but do not know the actions themselves. Without knowing these actions, the observation sequences are highly stochastic and multi- modal Babaeizadeh et al. (2018) . Trying to infer these unknown actions without prior knowledge of the world is difficult due to the problem of attributing which parts of the observations are influenced by the actions and which parts are governed by normal system evolution or noise. Therefore, in this work, we hypothesise that in order to make best use of observation-only sequences, an agent has to first understand the notion of an action. This can be achieved by learning a model from an agent\'s past experiences where both the actions and their consequences, i.e. observations, are available. Given such a learnt model which includes a causal model of actions and their effects, it becomes feasible for an agent to infer an action sequence leading to given observation-only data. In this work, we propose a novel algorithm, Action Inference by Maximising Evidence (AIME), to try to replicate the imitation ability of humans. The agent first learns a world model from its past experience by maximising the evidence of these experiences. After receiving some observation-only demonstrations of a novel task, the agent tries to mimic the demonstrator by finding an action sequence that makes the demonstration most likely under the learnt model. This procedure is shown in Figure 1 . Our contribution can be summarised as follows: • We propose AIME, a novel method for imitation from observation. AIME first learns a world model by maximising the evidence of its past experience, then consider the policy as an action inference model and imitates by maximising the evidence of demonstration. • We conduct experiments with a variety of datasets and tasks to demonstrate the superior performance of AIME compared with other state-of-the-art methods.\n\nSection Title: PROBLEM FORMULATION\n PROBLEM FORMULATION Consider an MDP problem defined by the tuple {S, A, T, R}, where S is the state space, A is the action space, T : S × A → S is the dynamic function and R : S → R is the reward function. A POMDP adds partial observability upon an MDP with two components: the observation space O and the emission function Ω : S → O. The six components of a POMDP can be categorised into three groups: S, A and T define the embodiment of our agent, O and Ω define the sensors of our agent and R itself defines the task. The goal is to find a policy π : S → A which maximises the accumulated reward, i.e. t r t . In this paper, we want to study imitation learning within a fixed embodiment across different tasks. We presume the existence of two datasets for the same embodiment: • Embodiment dataset D body contains trajectories {o 0 , a 0 , o 1 , a 1 . . . } that represent past experiences of interacting with the environment. This dataset provides information about the embodiment for the algorithm to learn a model. For example, in this paper, the dataset is a replay buffer filled while solving some tasks with the same embodiment. But in general, it may be any collection of past experiences of the embodiment. • Demonstration dataset D demo contains a few expert trajectories {o 0 , o 1 , o 2 . . . } of the embodiment solving a certain task defined by R demo . The crucial difference between this dataset and the embodiment dataset is that the actions are not provided anymore since they are not observable from a third-person perspective. The goal of our agent is to use information in D body to learn a policy π from D demo which can solve the task defined by R demo as well as by the expert who generated D demo . For the sake of simplicity, we assume that the two datasets share the same observation space O and the emission model Ω.\n\nSection Title: METHODOLOGY\n METHODOLOGY In this section, we describe our proposed method, AIME, in detail. AIME consists of two phases. In the first phase, the knowledge of the embodiment is learnt through a form of world model; while in the second phase, this knowledge is used to imitate the expert.\n\nSection Title: PHASE 1: MODEL LEARNING\n PHASE 1: MODEL LEARNING In the first phase, we need to learn a model to understand our embodiment. We choose to achieve this by learning a world model. As an analogy to a language model, we formally define a world model as a probability distribution over sequences of observations. The model can be either unconditioned or conditioned on other factors such as previous observations or actions. For phase 1, the model needs to be the conditional distribution, i.e. p(o 1:T |a 0:T −1 ), to model the effect of the actions. When given a certain observation sequence, the likelihood of this sequence under the model is usually referred to as evidence. In this paper, we consider variational world models where the observation is governed by a Markovian hidden state. In the literature, this type of model is also referred to as a state-space model (SSM). Such a variational world model involves four components, namely f ϕ (o t ) is the encoder to extract the features from the observation; q ϕ (s t |s t−1 , a t−1 , z t ) and p θ (s t |s t−1 , a t−1 ) are the posterior and the prior of the latent state variable; while p θ (o t |s t ) is the decoder that decodes the observation distribution from the state. ϕ and θ represent the parameters of the inference model and the generation model respectively. Typically, a variational world model is trained by maximising the ELBO which is a lower bound of the log-likelihood, or evidence, of the observation sequence, i.e. log p θ (o 1:T |a 0:T −1 ). Given a sequence of observations, actions, and states, the objective function can be computed as The objective function is composed of two terms: the first term J rec is the likelihood of the observation under the inferred state, which is usually called the reconstruction loss; while the second term J KL is the KL divergence between the posterior and the prior distributions of the latent state. To compute the Reincarnating Reinforcement Learning Workshop at ICLR 2023 objective function, we need to sample the inferred states from the observation and action sequence. This is done by autoregressively sampling from the posterior with the re-parameterisation trick Kingma & Welling (2014) ; Rezende et al. (2014) . Combining all these, we formally define the optimisation problem for this phase as In this work, we use a specific variational world model called RSSM Hafner et al. (2019 ; 2020 ), which offers state-of-the-art performances by splitting the latent state to be a combination of deterministic and stochastic components.\n\nSection Title: PHASE 2: IMITATION LEARNING\n PHASE 2: IMITATION LEARNING In the second phase, we want to utilise the knowledge of the world model from the first phase to imitate the expert behaviour from the demonstration dataset D demo in which only sequences of observations but no actions are available. We will derive our algorithm from two different perspectives.\n\nSection Title: The Bayesian derivation\n The Bayesian derivation Since the actions are unknown in the demonstration, instead of modelling the conditional evidence in phase 1, we need to model the unconditional evidence, i.e. log p θ (o 1:T ). Thus, we also need to model the actions as latent variables together with the states. In this way, the reconstruction term J rec will stay the same as eq. (2), while the KL term will be defined on the joint distribution of states and actions, i.e. If we choose the action inference model in the form of a policy, i.e. π ψ (a t |s t ), and share it in both posterior and prior, then the new posterior and prior can be factorised as respectively. When we plug them into the eq. (5), the policy term cancels and we will get a similar optimisation problem with phase 1 as The main difference between eq. (4) and eq. (8) is where the action sequence is coming from. In phase 1, the action sequence is coming from the embodiment dataset, while in phase 2, it is sampled from the policy instead since it is not available in the demonstration dataset. The control derivation From another perspective, we can view phase 2 as a control problem. One crucial observation is that, as shown in eq. (1), given a trained world model, we can evaluate the lower bound of the evidence of any observation sequence given an associated action sequence as the condition. In a deterministic environment where the inverse dynamics model is injective, the true action sequence that leads to the observation sequence is the most likely under the true model. In general, the true action sequence may not necessarily be the most likely under the model. This is, however, a potential benefit of our approach. We are mainly interested in mimicking the expert\'s demonstration and may be better able to do so with a different action sequence. Thus, for each observation sequence that we get from the demonstration dataset, finding the missing action sequence can be considered as a trajectory-tracking problem and can be tackled by planning. To be specific, we can find the missing action sequence by solving the optimisation problem If we solve the above optimisation problem for every sequence in the demonstration dataset, the problem will be converted to a normal imitation learning problem and can be tackled with standard techniques such as behavioural cloning. We can also view this as forming an implicit inverse dynamics model (IDM) by inverting a forward model. Compute objective function J from eq. (1) Update policy parameters ψ ← ψ + α∇ ψ J To make it more efficient, we use amortised inference. We directly define a policy π ψ (a t |s t ) under the latent state of the world model. By composing the learnt world model and the policy, we can form a new generative model of the state sequence by the chain of s t → a t → s t+1 → a t+1 . . . → s T . Then we will get the same optimisation problem as eq. (8). To sum up, in AIME, we use the same objective function - the ELBO - in both phases with the only difference being the source of the action sequence. We provide the pseudo-code for the algorithm in Algorithm 1 with the colour highlighting the different origins of the actions between the two phases.\n\nSection Title: EXPERIMENTS\n EXPERIMENTS To test our method, we need multiple environments sharing an embodiment while posing different tasks. Therefore, we consider the embodiment Walker of the DeepMind Control Suite (DMC Suite) Tunyasuvunakool et al. (2020) , which poses three tasks: stand, walk and run. Following the common practise in the benchmark Hafner et al. (2020) , we repeat every action two times when interacting with the environment. For the Walker environment, the true state includes both the position and the velocity of each joint and the centre of mass of the body. In order to study the influence of different observation modalities, we consider three settings for each environment: MDP uses the true state as the observation; Visual uses images as the observation; LPOMDP uses only the position part of the state as the observation, so that information-wise it is identical to the Visual setting but the information is densely represented in a low-dimensional form. To generate the embodiment and demonstration datasets, we train a Dreamer Hafner et al. (2020) agent in the Visual setting for each of the tasks for 1M environment steps. Our Dreamer implementation can solve stand and walk quite well and efficiently. For the run task, the reward plateaus are around 600 but in the video, we can observe decent running behaviour. The exact performance of the demonstration dataset can be found at Table 1. We take the replay buffer of these trained agents as the embodiment datasets D body , which contain 1000 trajectories, and consider the converged policy as the expert to collect another 1000 trajectories as the demonstration dataset D demo . We only use 100 trajectories for the main experiments, and the left trajectories are used during ablation study. Besides the above embodiment datasets, we also study datasets generated by purely exploratory behaviour. To do this, we consider two settings. First, we use a random policy that samples uniformly from the action space to collect 1000 trajectories, and we call this a random dataset. Second, we train a Plan2Explore Sekar et al. (2020) agent for 1000 trajectories and label its replay buffer as the p2e dataset. Moreover, we also merge all the above datasets except the run dataset to form a mix dataset, Reincarnating Reinforcement Learning Workshop at ICLR 2023 which is close to a practical setting when one has a lot of experience with one embodiment and use also the data they collect to train a model.\n\nSection Title: BASELINES AND IMPLEMENTATION DETAILS\n BASELINES AND IMPLEMENTATION DETAILS We mainly compare our method with BCO Torabi et al. (2018a) , specifically the BCO(0) variant. BCO(0) first trains an IDM from the embodiment dataset and then used the trained IDM to label the demonstration dataset and then uses Behavioural Cloning (BC) to recover the policy. We do not compare with other methods since they either require further environment interactions Torabi et al. (2018b) ; Li et al. (2022) or use a goal-conditional setting Pathak et al. (2018) which does not suit the locomotion tasks. BCO(0) originally applies to the MDP setting. In order to adapt it to the POMDP setting, we use the common stacking trick Mnih et al. (2015) . In particular, we stack 5 consecutive observations to form a state for BCO. For the Visual setting, before stacking, the observation first passes through a shared CNN encoder to extract features. The structure of the CNN is implemented as in Ha & Schmidhuber; Hafner et al. (2019) . Both the IDM and the policy are implemented by an MLP with two hidden layers and 128 units for each layer. The activation functions are chosen to be ELU Clevert et al. (2015) for the hidden layers and Tanh for the output layer. Following the original paper, we split both datasets to 7 : 3 for the training and validation dataset. We train the models with the Adam Kingma & Ba (2014) optimiser for 50 epochs and then use the model with the best validation loss. To ensure that we are not artificially limiting the performance by computational resources, we also tried increasing both the width and depth of the MLP, but did not find any increase in performance. For our method, the RSSM implementation is largely following Dreamer-v1 Hafner et al. (2020) with continuous stochastic and deterministic variables. Although newer versions of Dreamer Hafner et al. (2021 ; 2023) offer some new tricks to improve performance, we choose not to use them for the sake of simplicity. We use a slightly larger state space for our experiment with 512 deterministic and 128 stochastic dimensions and find it generally eases the policy training process to collect the datasets. When decoding the image observation, we follow the same structure as Ha & Schmidhuber; Hafner et al. (2019) , while for the low-dimensional observations, the decoder is implemented as Gaussian distribution with both mean and variance being parameterised. Except for the deterministic part of the state using a GRU cell Chung et al. (2014) , all the other networks are MLPs with 2 hidden layers and 128 units of each layer. We do not use any free nats Hafner et al. (2020) , KL scaling Hafner et al. (2020) and KL balancing Hafner et al. (2021) tricks in the literature to relax the constraint of the KL term. When decoding low-dimensional signals, we sometimes observed the decoder yielding a degenerate solution as found in Seitzer et al. (2022) . We use their β-nll to remedy this problem, and since it re-weights the reconstruction term, we re-weight the KL term accordingly to maintain the balance. For phase 2 of AIME, we train the policy for 500 epochs and 100 gradient steps for each epoch. We report performance of the policy from the last epoch without any early stopping criteria. To maximise efficient usage of computational resources, we directly use the trained world model of the Dreamer agent when conducting our experiments, except for the Random dataset where we train the model from scratch in phase 1. An important finding is that, although loading pre-trained weights generally helps performance, the method does not perform well when we also transfer the weights of the learnt policy. We conjecture that it is due to learnt policies being stuck in some local minima that they are unable to escape.\n\nSection Title: RESULTS\n RESULTS The main result of our comparison is shown in Figure 2 . We can see that AIME largely outperforms BCO(0) in all the environment settings. AIME achieves the lowest performance on the Visual setting, but even that outperforms BCO(0)-MDP which can access the true states. We attribute the good performance to two reasons. First, the forward model has a better data utilisation rate than the inverse model because the forward model is trained to reconstruct whole observation sequences, while the inverse model only takes short clips of the sequence and only predicts the actions. Thus, the forward model has less chance to overfit and provides better generalisation. Second, by maximising the evidence, our method strives to find an action sequence that leads to the same outcome, not to recover the true actions. For many systems, the dynamics are not fully invertible. For example, if a human applies force to the wall, since the wall does not move, one cannot tell how much force is really Reincarnating Reinforcement Learning Workshop at ICLR 2023 applied by visual observation. The same situation applies to the Walker when certain joints are locked due to the singular pose. This same phenomenon is also discussed in Pathak et al. (2018) . How does the choice of dataset influence the results? First, as expected, for all the variants of methods, transferring within the same task is better than transferring between different tasks. In these settings, BCO(0)-MDP is comparable with AIME. However, AIME shines in cross-task transfer. Especially when transferring between run and walk tasks and transferring from stand to run, AIME outperforms the baselines by a large margin, which indicates the strong generalisability of a forward model over an inverse model. We also find that AIME makes substantially better use of exploratory data. AIME largely outperforms baselines when using the p2e dataset as the embodiment dataset and outperforms most variants when using the Random dataset as the embodiment dataset. Moreover, when transferring from the mix dataset, except for the MDP version, AIME outperforms other variants that train the world model on one of the mixed datasets individually. This showcases the scalability of a world model to be trained on a diverse set of experiences, which could be more valuable in real-world scenarios. How is AIME influenced by the observation modality? Compared with BCO(0), AIME is quite robust to the choice of observation modality, but it still makes a little difference. Although the observation provides the same information, we find AIME in the LPOMDP setting performs much better than in the Visual setting in all test cases. We attribute it to the fact that low-dimension signals have denser information and offer a smoother landscape in the evidence space than the pixels so that it can provide a more useful gradient to guide the action inference. Surprisingly, although having access to more information, AIME-MDP performs worse than AIME-LPOMDP on average. The biggest gap happens when transferring from p2e dataset. We conjecture this to the fact the RSSM world model of AIME-MDP is not trained well with the default hyper-parameters, but we defer further investigation to future work. How does the number of demonstrations influence the performance? We conduct the ab- lation study on p2e embodiment dataset where we vary the number of demonstrations within {1, 2, 5, 10, 20, 50, 100, 200, 500, 1000}. The result is shown in Figure 3 . BCO(0) struggles with low-data scenarios and typically needs at least 10 demonstrations to surpass its initial performance. In contrast, AIME demonstrates continual improvement with as few as 2 trajectories. This demonstrates the superior sample efficiency of our method. Moreover, the performance of AIME keeps increasing as more trajectories are provided beyond 100, which showcases the scalability of the method. We also notice that there is a dent in the performance curve at 5 demonstrations and we conjecture it is caused by bad demonstrations.\n\nSection Title: RELATED WORK\n RELATED WORK Imitation learning from observations Previous works on imitation learning from only observation can be roughly categorised into two groups, one based on IDMs Torabi et al. (2018a) ; Baker et al. (2022) ; Pathak et al. (2018) and one based on generative adversarial imitation learning (GAIL) Ho & Ermon (2016) ; Torabi et al. (2018b) ; Li et al. (2022) . The core component of the first group is to learn an IDM that maps a state transition pair to the action that caused the transition. Torabi et al. (2018a) ; Baker et al. (2022) use the IDM to label the expert\'s observation sequences, then solve the imitation learning problem with standard BC. Pathak et al. (2018) extends the IDM to a goal-conditioned setting and communicates the task with keyframes of the goal trajectory. Different from these methods, our approach uses a forward model to capture the knowledge of the embodiment. In the second group of approaches, the core component is a discriminator that distinguishes the demonstrator\'s and the agent\'s observation trajectories. Then the discriminator serves as a reward function, and the agent\'s policy is trained by RL Ho & Ermon (2016) . As a drawback, in order to train this discriminator the agent has to constantly interact with the environment to produce negative samples. Different from these methods, our method does not require further interactions with the environment, enabling zero-shot imitation from the demonstration dataset.\n\nSection Title: Reusing learnt components in decision-making\n Reusing learnt components in decision-making Although transferring pre-trained models has become a dominant approach in natural language processing (NLP) Devlin et al. (2019) ; Radford et al. (2019) ; Bommasani et al. (2021) and has been getting more popular in computer vision (CV) He et al. (2022) ; Bommasani et al. (2021) , reusing learnt components is less studied in the field of decision-making Agarwal et al. (2022) . Most existing works focus on transferring policies Finn et al. (2017) ; Baker et al. (2022) ; Agarwal et al. (2022) . On the other hand, the world model, a type of powerful perception model, that is purely trained by self-supervised learning lies behind the recent progress of model-based reinforcement learning Ha & Schmidhuber; Hafner et al. (2019 ; 2020 ; 2021; 2023); Łukasz Kaiser et al. (2020) ; Schrittwieser et al. (2020) . However, the transferability of these Reincarnating Reinforcement Learning Workshop at ICLR 2023 world models is not well-studied. Sekar et al. (2020) learns a policy by using a pre-trained world model from exploration data and demonstrates superior zero-shot and few-shot performance. We improve upon this direction by studying a different setting, i.e. imitation learning. In particular, we communicate the task to the model by observing the expert while Sekar et al. (2020) communicates the task by a ground truth reward function which is less accessible in a real-world setting.\n\nSection Title: DISCUSSION & CONCLUSION\n DISCUSSION & CONCLUSION In this paper, we present AIME, a model-based method for imitation from observations. The core of the method exploits the power of a pre-trained world model and inverses it w.r.t. action input by taking the gradients. On the Walker embodiment from the DMC Suite, we demonstrate superior performance compared to baselines, even when some baselines can access the true state. Although AIME performs well, there are still many limitations. First, humans mostly observe others with vision. Although AIME works quite well in the Visual setting, there is still a large gap compared with the LPOMDP setting where the low-level signals are observed. We attribute this to the fact that the loss surface of the pixel reconstruction loss may not be smooth enough to allow the gradient method to find an equally good solution. Second, in this paper, we only study the most simple setting where both the embodiment and sensor layout are fixed across tasks. On the other hand, humans observe others in a third-person perspective and can also imitate animals whose body is not even similar to humans\'. Relaxing these assumptions will open up possibilities to transfer across different embodiments and even directly from human videos. Third, for some tasks, even humans cannot achieve zero-shot imitation by only watching others. This may due to the task\'s complexity or completely unfamiliar skills. So, even with proper instruction, humans still need to practise in the environment to solve some tasks. This motivates an online learning phase 3 as an extension to our framework. We defer these topics to future work.\n [/INST]', 'need_eos_token': True, 'output': '[Significance and novelty]\n<Imitation learning method AIME> The AIME framework presents an imitation learning method that learns a world model from past experience, and then uses that model to infer actions from an observation-only demonstration, showcasing a unique approach to reinforcement learning.\n<Comparison to transfer learning techniques> The paper discusses interesting similarities between AIME and transfer learning techniques in sequence modeling research, which may present a novel perspective for reinforcement learning.\n\n[Potential reasons for acceptance]\n<Strong experimental setup and results> The paper meticulously documents the choice of baselines, experimental setup, and testing on multiple representative environments, providing strong evidence of the generalizability and efficacy of the AIME framework.\n<Concise and understandable presentation> The paper is presented in a simple and understandable manner, making it accessible to a wide audience.\n\n[Potential reasons for rejection]\n<Unclear differentiation from existing approaches> The benefits of AIME over existing methods such as BCO and model-based reinforcement learning approaches are not explicitly laid out, and a clearer differentiation may be necessary for acceptance.\n<Limited evaluation in simple settings> The evaluation is noted to be on relatively simple tasks, and there is a suggestion to compare AIME to other algorithms in more complex settings.\n\n[Suggestions for improvement]\n<Explicit differentiation from related approaches> The authors could take time to explicitly lay out the differences between AIME and existing methods like BCO and model-based reinforcement learning approaches.\n<Comparison in more complex settings> It would be beneficial to compare AIME to other algorithms in more complex settings to showcase its performance in extremely difficult learning environments.\n\n', 'sep': '\n', 'system': "You are a professional machine learning conference reviewer who reviews a given paper and considers 4 criteria: [Significance and novelty], [Potential reasons for acceptance], [Potential reasons for rejection], and [Suggestions for improvement]. For each criterion, provide random number of supporting points derived from the paper's content. And for each supporting point, use the format: '<title of supporting point>' followed by a detailed explanation. Your response should only include your reviews only, which means always start with [Significance and novelty], dont' repeat the given paper and output things other than your reviews in required format. The paper is given as follows:"}] Map: 0%| | 0/10 [00:00<?, ? examples/s] Map: 40%|████ | 4/10 [00:00<00:00, 25.70 examples/s] Map: 70%|███████ | 7/10 [00:00<00:00, 26.73 examples/s] Map: 100%|██████████| 10/10 [00:00<00:00, 27.97 examples/s] Map: 100%|██████████| 10/10 [00:00<00:00, 27.31 examples/s] #################### encoded input_ids #################### [1, 995, 460, 264, 5024, 5599, 5168, 9887, 4058, 263, 693, 10237, 264, 2078, 3830, 304, 28325, 28705, 28781, 15117, 28747, 733, 7384, 921, 617, 304, 7092, 884, 1181, 733, 28753, 322, 2256, 6494, 354, 19871, 1181, 733, 28753, 322, 2256, 6494, 354, 312, 7183, 1181, 304, 733, 28735, 16939, 594, 354, 14204, 1592, 1263, 1430, 3190, 263, 296, 28725, 3084, 5509, 1474, 302, 11787, 3569, 11236, 477, 272, 3830, 28742, 28713, 3036, 28723, 1015, 354, 1430, 11787, 1305, 28725, 938, 272, 5032, 28747, 9648, 3901, 302, 11787, 1305, 14117, 4961, 486, 264, 10537, 13268, 28723, 3604, 2899, 1023, 865, 3024, 574, 10237, 865, 28725, 690, 2825, 1743, 1149, 395, 733, 7384, 921, 617, 304, 7092, 884, 1181, 7286, 28742, 13750, 272, 2078, 3830, 304, 3825, 1722, 799, 821, 574, 10237, 297, 3030, 5032, 28723, 415, 3830, 349, 2078, 390, 6104, 28747, 733, 16289, 28793, 995, 460, 264, 5024, 5599, 5168, 9887, 4058, 263, 693, 10237, 264, 2078, 3830, 304, 28325, 28705, 28781, 15117, 28747, 733, 7384, 921, 617, 304, 7092, 884, 1181, 733, 28753, 322, 2256, 6494, 354, 19871, 1181, 733, 28753, 322, 2256, 6494, 354, 312, 7183, 1181, 304, 733, 28735, 16939, 594, 354, 14204, 1592, 1263, 1430, 3190, 263, 296, 28725, 3084, 5509, 1474, 302, 11787, 3569, 11236, 477, 272, 3830, 28742, 28713, 3036, 28723, 1015, 354, 1430, 11787, 1305, 28725, 938, 272, 5032, 28747, 9648, 3901, 302, 11787, 1305, 14117, 4961, 486, 264, 10537, 13268, 28723, 3604, 2899, 1023, 865, 3024, 574, 10237, 865, 28725, 690, 2825, 1743, 1149, 395, 733, 7384, 921, 617, 304, 7092, 884, 1181, 7286, 28742, 13750, 272, 2078, 3830, 304, 3825, 1722, 799, 821, 574, 10237, 297, 3030, 5032, 28723, 415, 3830, 349, 2078, 390, 6104, 28747, 733, 28748, 16289, 28793, 13, 28792, 16289, 28793, 733, 28738, 1153, 1180, 28793, 13, 3795, 560, 2103, 486, 5824, 321, 2979, 5740, 3164, 28747, 23697, 28733, 1981, 322, 315, 2162, 352, 477, 15951, 352, 395, 3304, 3813, 1190, 13, 13, 28792, 2192, 4572, 7637, 28793, 13, 1733, 4091, 1080, 25234, 466, 5168, 11767, 690, 2699, 396, 521, 6487, 3320, 3558, 302, 4466, 14983, 298, 2822, 264, 633, 14006, 28725, 10589, 2823, 301, 438, 5168, 4377, 486, 10783, 3977, 288, 304, 503, 23251, 2663, 28723, 851, 5537, 6416, 9618, 356, 272, 1639, 369, 10589, 506, 264, 2229, 302, 652, 1216, 21322, 2487, 369, 5976, 706, 298, 13701, 272, 1080, 3917, 6768, 369, 4163, 298, 272, 7975, 14006, 28723, 560, 456, 3830, 28725, 478, 19333, 9624, 560, 2103, 486, 5824, 321, 2979, 5740, 3164, 325, 11741, 1574, 28731, 298, 312, 13112, 456, 14006, 1413, 1526, 4994, 28723, 330, 14462, 12335, 302, 989, 9494, 23105, 28723, 560, 272, 907, 6896, 28725, 272, 8073, 2822, 28713, 264, 1526, 2229, 477, 871, 2609, 2659, 298, 2380, 871, 1216, 2187, 486, 15350, 2979, 272, 5566, 3889, 3585, 325, 2980, 5759, 609, 4023, 297, 272, 1676, 6896, 28725, 272, 8073, 349, 2078, 741, 15379, 28733, 5041, 6695, 697, 302, 396, 7583, 13801, 264, 7092, 3638, 304, 14744, 298, 503, 9647, 272, 7583, 28742, 28713, 14006, 28723, 330, 14462, 4510, 1855, 456, 486, 22629, 264, 4920, 390, 396, 297, 2103, 2229, 304, 15350, 2979, 272, 5566, 302, 272, 6695, 352, 916, 272, 4920, 304, 1526, 2229, 28723, 3489, 2038, 349, 345, 8981, 28733, 7063, 28739, 297, 272, 3367, 369, 378, 1235, 459, 2699, 3629, 14983, 395, 272, 4466, 1024, 2078, 272, 6695, 352, 28723, 816, 7911, 361, 1944, 13412, 272, 6129, 28733, 7063, 503, 5174, 4397, 302, 813, 2038, 356, 272, 17734, 302, 272, 14972, 28755, 507, 10336, 25660, 304, 1300, 378, 575, 487, 14367, 272, 1665, 28733, 1009, 28733, 1237, 28733, 444, 2388, 25465, 28723, 816, 835, 1300, 330, 14462, 395, 3469, 13875, 1309, 9019, 272, 4397, 302, 272, 26605, 3977, 288, 272, 1132, 2859, 28733, 13267, 1665, 302, 272, 4466, 28723, 13, 13, 28792, 13706, 10940, 28735, 28793, 13, 25665, 28705, 28740, 28747, 5235, 1578, 302, 330, 14462, 9464, 28723, 560, 6896, 28705, 28740, 28725, 1560, 13875, 304, 6768, 460, 3857, 486, 272, 21322, 2487, 13466, 304, 272, 8073, 2822, 28713, 264, 3090, 1249, 1526, 2229, 298, 2229, 272, 5566, 302, 13875, 4644, 286, 356, 272, 6768, 28723, 2479, 272, 27205, 448, 2229, 17628, 460, 15199, 304, 16266, 298, 6896, 28705, 28750, 28723, 560, 6896, 28705, 28750, 28725, 865, 272, 13875, 460, 3857, 486, 272, 6695, 352, 13466, 28725, 579, 272, 8073, 3208, 298, 13701, 1560, 4605, 304, 6768, 28723, 415, 2992, 297, 2103, 349, 12011, 486, 272, 4920, 2229, 690, 11714, 6768, 2078, 264, 1665, 28723, 415, 15543, 4715, 11634, 272, 1526, 2229, 5287, 460, 15199, 297, 6896, 28705, 28750, 28723, 7829, 23105, 460, 6997, 2458, 4112, 272, 1348, 13640, 28725, 613, 28723, 28706, 28723, 272, 22630, 5759, 28723, 13, 25665, 28705, 28750, 28747, 2744, 674, 2233, 356, 17734, 28723, 7066, 4419, 12825, 624, 3638, 304, 871, 5363, 6695, 352, 13466, 28725, 1312, 1430, 5150, 12825, 272, 21322, 2487, 26882, 1307, 298, 5835, 272, 2229, 28723, 415, 3941, 302, 1430, 5248, 349, 5160, 4771, 298, 384, 2187, 3193, 384, 23969, 842, 15755, 1891, 460, 16860, 486, 13363, 4054, 3352, 28705, 28740, 28734, 19993, 304, 868, 24814, 298, 272, 13822, 302, 272, 7583, 28742, 28713, 4397, 28723, 415, 1432, 5150, 304, 4419, 460, 13363, 2569, 754, 272, 7606, 3638, 442, 13466, 28723, 13, 25665, 28705, 28770, 28747, 330, 1231, 352, 302, 1474, 302, 6695, 697, 356, 5998, 699, 477, 284, 28750, 28706, 13466, 28723, 415, 4397, 349, 4894, 390, 272, 13822, 302, 7583, 4397, 754, 28705, 28770, 17544, 28723, 13, 13, 28792, 12162, 2431, 28793, 13, 11268, 13623, 28747, 13289, 1594, 11788, 5324, 13, 28705, 13289, 1594, 11788, 5324, 560, 5391, 1267, 28725, 3534, 25234, 466, 5168, 325, 4232, 28758, 28731, 659, 9651, 18926, 5161, 28733, 14727, 11767, 298, 306, 6016, 297, 5166, 5080, 28705, 351, 19836, 911, 389, 28723, 325, 28750, 28734, 28740, 28782, 28731, 2753, 28705, 14511, 911, 389, 28723, 325, 28750, 28734, 28740, 28784, 28731, 2753, 28705, 550, 4279, 973, 911, 389, 19362, 689, 3950, 911, 389, 28723, 325, 28750, 28734, 28750, 28770, 28731, 2753, 28705, 5629, 11741, 911, 389, 28723, 325, 28750, 28734, 28740, 28774, 28731, 2753, 28705, 451, 4533, 602, 911, 389, 28723, 325, 28750, 28734, 28750, 28750, 28731, 842, 2993, 28725, 624, 302, 272, 8964, 4382, 302, 384, 5419, 349, 7324, 297, 6610, 9375, 28723, 415, 18669, 10782, 297, 384, 5419, 349, 5168, 477, 15147, 28705, 3786, 283, 18404, 911, 389, 28723, 325, 28750, 28734, 28750, 28750, 28731, 842, 7110, 28725, 1080, 18539, 2699, 396, 13978, 3558, 302, 14983, 395, 272, 4466, 28705, 351, 19836, 911, 389, 28723, 325, 28750, 28734, 28740, 28782, 28731, 2753, 28705, 14511, 911, 389, 28723, 325, 28750, 28734, 28740, 28784, 28731, 2753, 550, 4279, 973, 911, 389, 568, 560, 9349, 28725, 16779, 745, 8222, 1259, 390, 10589, 460, 2358, 298, 4377, 2822, 633, 9796, 1059, 776, 264, 1664, 9070, 28733, 391, 28733, 1958, 13199, 28725, 304, 541, 3629, 13057, 380, 652, 5168, 1759, 486, 3977, 288, 2663, 28723, 1094, 2278, 5133, 1444, 21549, 5168, 304, 272, 384, 5419, 10782, 349, 369, 272, 4494, 6098, 2609, 2659, 354, 633, 9796, 28723, 1684, 478, 1464, 264, 7092, 3638, 28725, 478, 938, 8361, 27205, 448, 8076, 28725, 304, 2952, 864, 298, 12049, 272, 633, 2700, 23463, 28723, 851, 1759, 349, 20836, 286, 486, 503, 5174, 5168, 28705, 315, 323, 598, 7997, 325, 28750, 28734, 28734, 28783, 28731, 1200, 690, 5976, 592, 298, 312, 13112, 3684, 10205, 2020, 1671, 1863, 15379, 302, 272, 14164, 14540, 15071, 28723, 1047, 272, 384, 5419, 11767, 829, 21922, 295, 22164, 3977, 1249, 1178, 28725, 1259, 390, 272, 15116, 440, 3270, 3798, 1178, 28725, 272, 7324, 12832, 993, 347, 25797, 11957, 28705, 19803, 911, 389, 28723, 325, 28750, 28734, 28750, 28750, 28731, 842, 2993, 28725, 5090, 5168, 264, 2229, 477, 15379, 28733, 5041, 22456, 349, 1488, 23885, 354, 1560, 21549, 304, 10067, 4918, 28725, 1854, 478, 541, 865, 14631, 272, 14120, 302, 6768, 28725, 562, 511, 459, 873, 272, 6768, 3892, 28723, 11528, 8215, 1167, 6768, 28725, 272, 15379, 22456, 460, 6416, 341, 4042, 3953, 304, 6079, 28733, 22946, 28705, 15726, 4643, 463, 770, 28716, 911, 389, 28723, 325, 28750, 28734, 28740, 28783, 28731, 842, 320, 5125, 298, 13701, 1167, 9038, 6768, 1671, 4681, 4788, 302, 272, 1526, 349, 3796, 2940, 298, 272, 2700, 302, 19096, 288, 690, 5099, 302, 272, 13875, 460, 19927, 486, 272, 6768, 304, 690, 5099, 460, 22175, 486, 4123, 1587, 10195, 442, 8498, 28723, 8469, 28725, 297, 456, 771, 28725, 478, 19108, 864, 369, 297, 1745, 298, 1038, 1489, 938, 302, 15379, 28733, 5041, 22456, 28725, 396, 8073, 659, 298, 907, 2380, 272, 14086, 302, 396, 2992, 28723, 851, 541, 347, 12011, 486, 5168, 264, 2229, 477, 396, 8073, 28742, 28713, 2609, 9021, 970, 1560, 272, 6768, 304, 652, 13416, 28725, 613, 28723, 28706, 28723, 13875, 28725, 460, 2632, 28723, 12628, 1259, 264, 27205, 448, 2229, 690, 5532, 264, 3599, 282, 2229, 302, 6768, 304, 652, 6092, 28725, 378, 6755, 25953, 1070, 354, 396, 8073, 298, 13701, 396, 2992, 7768, 5374, 298, 2078, 15379, 28733, 5041, 1178, 28723, 560, 456, 771, 28725, 478, 19333, 264, 7092, 9464, 28725, 9624, 560, 2103, 486, 5824, 321, 2979, 5740, 3164, 325, 11741, 1574, 557, 298, 1464, 298, 312, 13112, 272, 503, 5174, 5537, 302, 10589, 28723, 415, 8073, 907, 2822, 28713, 264, 1526, 2229, 477, 871, 2609, 2659, 486, 15350, 2979, 272, 5566, 302, 1167, 9021, 28723, 2530, 11864, 741, 15379, 28733, 5041, 6695, 697, 302, 264, 7092, 3638, 28725, 272, 8073, 14744, 298, 26302, 294, 272, 6695, 1028, 486, 7484, 396, 2992, 7768, 369, 2870, 272, 6695, 352, 1080, 3917, 916, 272, 27205, 448, 2229, 28723, 851, 10722, 349, 4894, 297, 28705, 9240, 28705, 28740, 842, 3489, 11817, 541, 347, 18062, 2458, 390, 6104, 28747, 3700, 816, 19333, 330, 14462, 28725, 264, 7092, 2038, 354, 503, 5174, 477, 15379, 28723, 330, 14462, 907, 2822, 28713, 264, 1526, 2229, 486, 15350, 2979, 272, 5566, 302, 871, 2609, 2659, 28725, 868, 1917, 272, 4920, 390, 396, 2992, 297, 2103, 2229, 304, 503, 279, 1002, 486, 15350, 2979, 272, 5566, 302, 6695, 352, 28723, 3700, 816, 9086, 14107, 395, 264, 6677, 302, 26882, 304, 9796, 298, 16499, 272, 11352, 4397, 302, 330, 14462, 6731, 395, 799, 1665, 28733, 1009, 28733, 1237, 28733, 444, 5562, 28723, 13, 13, 11268, 13623, 28747, 7497, 28760, 1180, 28755, 6907, 25586, 4866, 13, 28705, 7497, 28760, 1180, 28755, 6907, 25586, 4866, 11772, 396, 351, 10485, 2700, 4057, 486, 272, 21000, 371, 28735, 28725, 330, 28725, 320, 28725, 399, 881, 970, 318, 349, 272, 1665, 2764, 28725, 330, 349, 272, 2992, 2764, 28725, 320, 714, 318, 15770, 330, 15672, 318, 349, 272, 10616, 908, 304, 399, 714, 318, 15672, 399, 349, 272, 11499, 908, 28723, 330, 367, 4341, 10485, 13633, 10473, 3977, 2437, 3714, 396, 351, 10485, 395, 989, 8076, 28747, 272, 15379, 2764, 451, 304, 272, 17067, 908, 28705, 229, 135, 169, 714, 318, 15672, 451, 28723, 415, 3522, 8076, 302, 264, 367, 4341, 10485, 541, 347, 20577, 2458, 778, 1712, 4938, 28747, 318, 28725, 330, 304, 320, 6600, 272, 21322, 2487, 302, 813, 8073, 28725, 451, 304, 28705, 229, 135, 169, 6600, 272, 27810, 302, 813, 8073, 304, 399, 3837, 18791, 272, 3638, 28723, 415, 5541, 349, 298, 1300, 264, 4920, 28705, 29030, 714, 318, 15672, 330, 690, 15350, 3900, 272, 14341, 6432, 11499, 28725, 613, 28723, 28706, 28723, 261, 408, 261, 842, 560, 456, 3830, 28725, 478, 947, 298, 3881, 503, 5174, 5168, 2373, 264, 6321, 21322, 2487, 2673, 1581, 9796, 28723, 816, 1258, 2150, 272, 9025, 302, 989, 26882, 354, 272, 1348, 21322, 2487, 28747, 3700, 18065, 350, 2487, 13466, 384, 2187, 5876, 21699, 2420, 371, 28709, 28705, 28734, 1200, 264, 28705, 28734, 1200, 289, 28705, 28740, 1200, 264, 28705, 28740, 842, 842, 842, 443, 369, 2904, 2609, 9021, 302, 14113, 288, 395, 272, 4466, 28723, 851, 13466, 5312, 1871, 684, 272, 21322, 2487, 354, 272, 9464, 298, 2822, 264, 2229, 28723, 1263, 2757, 28725, 297, 456, 3830, 28725, 272, 13466, 349, 264, 312, 1674, 5496, 6774, 1312, 22100, 741, 9796, 395, 272, 1348, 21322, 2487, 28723, 1092, 297, 2952, 28725, 378, 993, 347, 707, 5442, 302, 2609, 9021, 302, 272, 21322, 2487, 28723, 3700, 4267, 266, 1042, 352, 13466, 384, 23969, 5876, 264, 1664, 7583, 21699, 2420, 371, 28709, 28705, 28734, 1200, 289, 28705, 28740, 1200, 289, 28705, 28750, 842, 842, 842, 443, 302, 272, 21322, 2487, 22100, 264, 2552, 3638, 4057, 486, 399, 23969, 842, 415, 13040, 5133, 1444, 456, 13466, 304, 272, 21322, 2487, 13466, 349, 369, 272, 6768, 460, 459, 3857, 9229, 1854, 590, 460, 459, 3977, 522, 477, 264, 4008, 28733, 9701, 10403, 28723, 415, 5541, 302, 813, 8073, 349, 298, 938, 1871, 297, 384, 2187, 298, 2822, 264, 4920, 28705, 29030, 477, 384, 23969, 690, 541, 12049, 272, 3638, 4057, 486, 399, 23969, 390, 1162, 390, 486, 272, 7583, 693, 7138, 384, 23969, 842, 1263, 272, 14408, 302, 25233, 28725, 478, 7146, 369, 272, 989, 26882, 4098, 272, 1348, 15379, 2764, 451, 304, 272, 17067, 2229, 28705, 229, 135, 169, 28723, 13, 13, 11268, 13623, 28747, 351, 21138, 8267, 5305, 28802, 13, 28705, 351, 21138, 8267, 5305, 28802, 560, 456, 4211, 28725, 478, 6685, 813, 8278, 2038, 28725, 330, 14462, 28725, 297, 8291, 28723, 330, 14462, 12335, 302, 989, 23105, 28723, 560, 272, 907, 6896, 28725, 272, 4788, 302, 272, 21322, 2487, 349, 27205, 448, 1059, 264, 1221, 302, 1526, 2229, 28745, 1312, 297, 272, 1676, 6896, 28725, 456, 4788, 349, 1307, 298, 503, 9647, 272, 7583, 28723, 13, 13, 11268, 13623, 28747, 11063, 11257, 28705, 28740, 28747, 9875, 1336, 28758, 14181, 1087, 22021, 13, 28705, 11063, 11257, 28705, 28740, 28747, 9875, 1336, 28758, 14181, 1087, 22021, 560, 272, 907, 6896, 28725, 478, 927, 298, 2822, 264, 2229, 298, 2380, 813, 21322, 2487, 28723, 816, 4987, 298, 6619, 456, 486, 5168, 264, 1526, 2229, 28723, 1136, 396, 2880, 8677, 298, 264, 3842, 2229, 28725, 478, 25491, 6600, 264, 1526, 2229, 390, 264, 10966, 6023, 754, 22456, 302, 13875, 28723, 415, 2229, 541, 347, 2477, 521, 12058, 286, 442, 4644, 286, 356, 799, 8612, 1259, 390, 3454, 13875, 442, 6768, 28723, 1263, 6896, 28705, 28740, 28725, 272, 2229, 3208, 298, 347, 272, 2076, 2582, 6023, 28725, 613, 28723, 28706, 28723, 284, 28732, 28709, 28705, 28740, 28747, 28738, 342, 28708, 28705, 28734, 28747, 28738, 8798, 28740, 6223, 298, 2229, 272, 2030, 302, 272, 6768, 28723, 1684, 2078, 264, 2552, 15379, 7768, 28725, 272, 24277, 302, 456, 7768, 916, 272, 2229, 349, 4312, 11449, 298, 390, 5566, 28723, 560, 456, 3830, 28725, 478, 1917, 3090, 1249, 1526, 4994, 970, 272, 15379, 349, 22175, 486, 264, 3655, 626, 753, 7918, 1665, 28723, 560, 272, 11354, 28725, 456, 1212, 302, 2229, 349, 835, 11449, 298, 390, 264, 1665, 28733, 3603, 2229, 325, 1383, 28755, 609, 10373, 264, 3090, 1249, 1526, 2229, 14657, 2308, 8076, 28725, 19902, 285, 28705, 210, 152, 325, 28709, 261, 1143, 349, 272, 2524, 5722, 298, 9131, 272, 4190, 477, 272, 15379, 28745, 4256, 28705, 210, 152, 325, 28713, 261, 342, 28713, 261, 28923, 28740, 1200, 264, 261, 28923, 28740, 1200, 686, 261, 1143, 304, 284, 28705, 29255, 325, 28713, 261, 342, 28713, 261, 28923, 28740, 1200, 264, 261, 28923, 28740, 1143, 460, 272, 20862, 304, 272, 4681, 302, 272, 4039, 308, 1665, 7860, 28745, 1312, 284, 28705, 29255, 325, 28709, 261, 342, 28713, 261, 1143, 349, 272, 1285, 5722, 369, 1285, 2804, 272, 15379, 6023, 477, 272, 1665, 28723, 28705, 210, 152, 304, 28705, 29255, 2904, 272, 5287, 302, 272, 297, 2103, 2229, 304, 272, 8342, 2229, 8628, 28723, 15633, 1944, 28725, 264, 3090, 1249, 1526, 2229, 349, 10898, 486, 15350, 2979, 272, 22630, 5759, 690, 349, 264, 3889, 3585, 302, 272, 2085, 28733, 4634, 18538, 28725, 442, 5566, 28725, 302, 272, 15379, 7768, 28725, 613, 28723, 28706, 28723, 2085, 284, 28705, 29255, 325, 28709, 28705, 28740, 28747, 28738, 342, 28708, 28705, 28734, 28747, 28738, 8798, 28740, 11505, 12628, 264, 7768, 302, 13875, 28725, 6768, 28725, 304, 4605, 28725, 272, 13640, 908, 541, 347, 16860, 390, 415, 13640, 908, 349, 15021, 302, 989, 3471, 28747, 272, 907, 1850, 475, 937, 349, 272, 24277, 302, 272, 15379, 916, 272, 297, 9497, 1665, 28725, 690, 349, 4312, 1987, 272, 27419, 4320, 28745, 1312, 272, 1676, 1850, 475, 524, 28758, 349, 272, 524, 28758, 19002, 11986, 1444, 272, 20862, 304, 272, 4681, 20779, 302, 272, 4039, 308, 1665, 28723, 1791, 12881, 272, 1298, 1814, 1331, 1077, 24541, 5712, 466, 17504, 5066, 9876, 438, 315, 3100, 28754, 28705, 28750, 28734, 28750, 28770, 13640, 908, 28725, 478, 927, 298, 7324, 272, 297, 9497, 4605, 477, 272, 15379, 304, 2992, 7768, 28723, 851, 349, 2203, 486, 1206, 431, 2736, 2260, 23230, 477, 272, 20862, 395, 272, 312, 28733, 14490, 5463, 10777, 28705, 4041, 705, 567, 394, 3572, 325, 28750, 28734, 28740, 28781, 28731, 2753, 28705, 1298, 28764, 4539, 911, 389, 28723, 325, 28750, 28734, 28740, 28781, 28731, 842, 19422, 4038, 544, 1167, 28725, 478, 25491, 6600, 272, 6997, 5463, 2700, 354, 456, 6896, 390, 560, 456, 771, 28725, 478, 938, 264, 2948, 3090, 1249, 1526, 2229, 1987, 399, 1383, 28755, 28705, 382, 2015, 1079, 911, 389, 28723, 325, 28750, 28734, 28740, 28774, 2753, 259, 28750, 28734, 28750, 28734, 6223, 690, 5751, 1665, 28733, 1009, 28733, 1237, 28733, 444, 17487, 486, 28578, 272, 4039, 308, 1665, 298, 347, 264, 9470, 302, 16710, 3320, 304, 341, 4042, 3953, 8076, 28723, 13, 13, 11268, 13623, 28747, 11063, 11257, 28705, 28750, 28747, 17593, 1153, 4866, 14181, 1087, 22021, 13, 28705, 11063, 11257, 28705, 28750, 28747, 17593, 1153, 4866, 14181, 1087, 22021, 560, 272, 1676, 6896, 28725, 478, 947, 298, 4479, 864, 272, 4788, 302, 272, 1526, 2229, 477, 272, 907, 6896, 298, 503, 9647, 272, 7583, 14006, 477, 272, 6695, 352, 13466, 384, 23969, 297, 690, 865, 22456, 302, 13875, 562, 708, 6768, 460, 2632, 28723, 816, 622, 24058, 813, 9464, 477, 989, 1581, 3708, 26859, 28723, 13, 13, 11268, 13623, 28747, 415, 6644, 17754, 12314, 352, 13, 28705, 415, 6644, 17754, 12314, 352, 4577, 272, 6768, 460, 9038, 297, 272, 6695, 352, 28725, 3519, 302, 968, 3572, 272, 2076, 2582, 5566, 297, 6896, 28705, 28740, 28725, 478, 927, 298, 2229, 272, 521, 1213, 2582, 5566, 28725, 613, 28723, 28706, 28723, 2085, 284, 28705, 29255, 325, 28709, 28705, 28740, 28747, 28738, 11505, 7110, 28725, 478, 835, 927, 298, 2229, 272, 6768, 390, 4039, 308, 8925, 2553, 395, 272, 4605, 28723, 560, 456, 1069, 28725, 272, 27419, 1850, 475, 937, 622, 3079, 272, 1348, 390, 12084, 28723, 325, 28750, 557, 1312, 272, 524, 28758, 1850, 622, 347, 4057, 356, 272, 9353, 6023, 302, 4605, 304, 6768, 28725, 613, 28723, 28706, 28723, 1047, 478, 4987, 272, 2992, 297, 2103, 2229, 297, 272, 1221, 302, 264, 4920, 28725, 613, 28723, 28706, 28723, 28705, 29030, 28705, 30341, 325, 28708, 261, 342, 28713, 261, 6223, 304, 4098, 378, 297, 1560, 20862, 304, 4681, 28725, 868, 272, 633, 20862, 304, 4681, 541, 347, 6999, 2458, 390, 8628, 28723, 1684, 478, 16605, 706, 778, 272, 12084, 28723, 325, 28782, 557, 272, 4920, 1850, 541, 28717, 1190, 304, 478, 622, 625, 264, 3684, 6997, 5463, 2700, 395, 6896, 28705, 28740, 390, 415, 2191, 5133, 1444, 12084, 28723, 325, 28781, 28731, 304, 12084, 28723, 325, 28783, 28731, 349, 970, 272, 2992, 7768, 349, 3524, 477, 28723, 560, 6896, 28705, 28740, 28725, 272, 2992, 7768, 349, 3524, 477, 272, 21322, 2487, 13466, 28725, 1312, 297, 6896, 28705, 28750, 28725, 378, 349, 4545, 9667, 477, 272, 4920, 3519, 1854, 378, 349, 459, 2632, 297, 272, 6695, 352, 13466, 28723, 415, 2602, 12314, 352, 3672, 1698, 10403, 28725, 478, 541, 2204, 6896, 28705, 28750, 390, 264, 2602, 2700, 28723, 2387, 13040, 15379, 349, 369, 28725, 390, 4894, 297, 12084, 28723, 325, 28740, 557, 2078, 264, 10898, 1526, 2229, 28725, 478, 541, 15627, 272, 3889, 3585, 302, 272, 5566, 302, 707, 15379, 7768, 2078, 396, 5363, 2992, 7768, 390, 272, 4644, 28723, 560, 264, 16710, 3320, 4466, 970, 272, 22668, 15562, 2229, 349, 16453, 495, 28725, 272, 1132, 2992, 7768, 369, 8681, 298, 272, 15379, 7768, 349, 272, 1080, 3917, 916, 272, 1132, 2229, 28723, 560, 2952, 28725, 272, 1132, 2992, 7768, 993, 459, 11867, 347, 272, 1080, 3917, 916, 272, 2229, 28723, 851, 349, 28725, 3545, 28725, 264, 4628, 7949, 302, 813, 4431, 28723, 816, 460, 11464, 6348, 297, 26302, 16402, 272, 7583, 28742, 28713, 6695, 352, 304, 993, 347, 1873, 2358, 298, 511, 579, 395, 264, 1581, 2992, 7768, 28723, 7110, 28725, 354, 1430, 15379, 7768, 369, 478, 625, 477, 272, 6695, 352, 13466, 28725, 7484, 272, 6925, 2992, 7768, 541, 347, 4525, 390, 264, 21699, 695, 28733, 7822, 288, 2700, 304, 541, 347, 13616, 1006, 486, 7394, 28723, 1791, 347, 2948, 28725, 478, 541, 1300, 272, 6925, 2992, 7768, 486, 22100, 272, 6997, 5463, 2700, 1047, 478, 12049, 272, 2747, 6997, 5463, 2700, 354, 1012, 7768, 297, 272, 6695, 352, 13466, 28725, 272, 2700, 622, 347, 15514, 298, 264, 4123, 503, 5174, 5168, 2700, 304, 541, 347, 13616, 1006, 395, 4787, 9804, 1259, 390, 14006, 282, 533, 20392, 28723, 816, 541, 835, 2204, 456, 390, 20345, 396, 21628, 22668, 15562, 2229, 325, 1138, 28755, 28731, 486, 297, 343, 1157, 264, 3814, 2229, 28723, 1124, 8145, 13640, 908, 475, 477, 12084, 28723, 325, 28740, 28731, 8980, 4920, 5287, 28705, 30341, 28705, 29787, 28705, 30341, 648, 28705, 28948, 229, 139, 138, 28705, 30341, 475, 1791, 1038, 378, 680, 9096, 28725, 478, 938, 837, 419, 2458, 297, 2103, 28723, 816, 5090, 6600, 264, 4920, 28705, 29030, 28705, 30341, 325, 28708, 261, 342, 28713, 261, 1143, 916, 272, 4039, 308, 1665, 302, 272, 1526, 2229, 28723, 2463, 5890, 288, 272, 27205, 448, 1526, 2229, 304, 272, 4920, 28725, 478, 541, 1221, 264, 633, 1350, 1197, 2229, 302, 272, 1665, 7768, 486, 272, 7650, 302, 268, 261, 15672, 264, 261, 15672, 268, 261, 28806, 28740, 15672, 264, 261, 28806, 28740, 842, 842, 842, 15672, 268, 320, 842, 2479, 478, 622, 625, 272, 1348, 6997, 5463, 2700, 390, 12084, 28723, 325, 28783, 609, 1791, 2648, 582, 28725, 297, 330, 14462, 28725, 478, 938, 272, 1348, 13640, 908, 387, 272, 22630, 5759, 387, 297, 1560, 23105, 395, 272, 865, 5133, 1250, 272, 2832, 302, 272, 2992, 7768, 28723, 816, 3084, 272, 26191, 28733, 1409, 354, 272, 9464, 297, 976, 6433, 28705, 28740, 395, 272, 11474, 12144, 288, 272, 1581, 26081, 302, 272, 6768, 1444, 272, 989, 23105, 28723, 13, 13, 11268, 13623, 28747, 4175, 6512, 4019, 2431, 28735, 13, 28705, 4175, 6512, 4019, 2431, 28735, 1791, 1369, 813, 2038, 28725, 478, 927, 5166, 20031, 10681, 396, 21322, 2487, 1312, 977, 288, 1581, 9796, 28723, 8469, 28725, 478, 1917, 272, 21322, 2487, 17734, 302, 272, 14972, 28755, 507, 10336, 25660, 325, 28757, 7814, 25660, 28731, 28705, 22756, 28724, 293, 5388, 370, 491, 796, 911, 389, 28723, 325, 28750, 28734, 28750, 28734, 28731, 1200, 690, 977, 274, 1712, 9796, 28747, 1876, 28725, 2338, 304, 1482, 28723, 13407, 272, 3298, 2844, 864, 297, 272, 11984, 3325, 28705, 382, 2015, 1079, 911, 389, 28723, 325, 28750, 28734, 28750, 28734, 28731, 1200, 478, 13750, 1012, 2992, 989, 2421, 739, 14113, 288, 395, 272, 4466, 28723, 1263, 272, 17734, 4466, 28725, 272, 1132, 1665, 5532, 1560, 272, 2840, 304, 272, 13469, 302, 1430, 9353, 304, 272, 8802, 302, 3042, 302, 272, 2187, 28723, 560, 1745, 298, 3881, 272, 8015, 302, 1581, 15379, 22946, 1218, 28725, 478, 1917, 1712, 6472, 354, 1430, 4466, 28747, 351, 10485, 6098, 272, 1132, 1665, 390, 272, 15379, 28745, 24497, 6098, 6203, 390, 272, 15379, 28745, 393, 3402, 4915, 28753, 6098, 865, 272, 2840, 744, 302, 272, 1665, 390, 272, 15379, 28725, 579, 369, 1871, 28733, 3578, 378, 349, 15792, 298, 272, 24497, 5587, 562, 272, 1871, 349, 16687, 723, 10483, 297, 264, 2859, 28733, 13267, 1221, 28723, 1791, 8270, 272, 21322, 2487, 304, 6695, 352, 26882, 28725, 478, 5835, 264, 14031, 263, 28705, 382, 2015, 1079, 911, 389, 28723, 325, 28750, 28734, 28750, 28734, 28731, 28705, 8073, 297, 272, 24497, 5587, 354, 1430, 302, 272, 9796, 354, 28705, 28740, 28755, 4466, 5944, 28723, 3489, 14031, 263, 9786, 541, 12049, 1876, 304, 2338, 3448, 1162, 304, 23463, 28723, 1263, 272, 1482, 3638, 28725, 272, 11499, 9703, 1899, 460, 1401, 28705, 28784, 28734, 28734, 562, 297, 272, 3798, 28725, 478, 541, 14631, 12564, 3822, 14006, 28723, 415, 3459, 4397, 302, 272, 6695, 352, 13466, 541, 347, 1419, 438, 7582, 28705, 28740, 28723, 816, 1388, 272, 312, 1674, 5496, 302, 1167, 10898, 11767, 390, 272, 21322, 2487, 26882, 384, 2187, 1200, 690, 7001, 28705, 28740, 28734, 28734, 28734, 21699, 2420, 28725, 304, 1917, 272, 9105, 2560, 4920, 390, 272, 7583, 298, 5902, 1698, 28705, 28740, 28734, 28734, 28734, 21699, 2420, 390, 272, 6695, 352, 13466, 384, 23969, 842, 816, 865, 938, 28705, 28740, 28734, 28734, 21699, 2420, 354, 272, 2191, 14107, 28725, 304, 272, 1749, 21699, 2420, 460, 1307, 1938, 534, 1465, 3881, 28723, 14862, 272, 2747, 21322, 2487, 26882, 28725, 478, 835, 3881, 26882, 7138, 486, 21690, 1949, 271, 5377, 14006, 28723, 1791, 511, 456, 28725, 478, 1917, 989, 6472, 28723, 4205, 28725, 478, 938, 264, 5509, 4920, 369, 11714, 8687, 346, 477, 272, 2992, 2764, 298, 5902, 28705, 28740, 28734, 28734, 28734, 21699, 2420, 28725, 304, 478, 1034, 456, 264, 5509, 13466, 28723, 7052, 28725, 478, 5835, 264, 9116, 28750, 966, 452, 431, 28705, 318, 950, 283, 911, 389, 28723, 325, 28750, 28734, 28750, 28734, 28731, 28705, 8073, 354, 28705, 28740, 28734, 28734, 28734, 21699, 2420, 304, 3870, 871, 312, 1674, 5496, 390, 272, 284, 28750, 28706, 13466, 28723, 11302, 28725, 478, 835, 13973, 544, 272, 2747, 26882, 3741, 272, 1482, 13466, 298, 1221, 264, 6750, 13466, 28725, 1298, 1814, 1331, 1077, 24541, 5712, 466, 17504, 5066, 9876, 438, 315, 3100, 28754, 28705, 28750, 28734, 28750, 28770, 690, 349, 2848, 298, 264, 11118, 5587, 739, 624, 659, 264, 2055, 302, 2659, 395, 624, 21322, 2487, 304, 938, 835, 272, 1178, 590, 5902, 298, 5835, 264, 2229, 28723, 13, 13, 11268, 13623, 28747, 365, 11257, 27247, 2255, 5771, 15347, 20255, 4866, 384, 2094, 26763, 28735, 13, 28705, 365, 11257, 27247, 2255, 5771, 15347, 20255, 4866, 384, 2094, 26763, 28735, 816, 11464, 9584, 813, 2038, 395, 365, 1998, 28705, 5727, 16075, 911, 389, 28723, 325, 28750, 28734, 28740, 28783, 28708, 28731, 1200, 10107, 272, 365, 1998, 28732, 28734, 28731, 15944, 28723, 365, 1998, 28732, 28734, 28731, 907, 19770, 396, 4519, 28755, 477, 272, 21322, 2487, 13466, 304, 868, 1307, 272, 10898, 4519, 28755, 298, 3870, 272, 6695, 352, 13466, 304, 868, 6098, 1739, 8832, 27290, 282, 1366, 20392, 325, 4738, 28731, 298, 9443, 272, 4920, 28723, 816, 511, 459, 9584, 395, 799, 5562, 1854, 590, 2477, 2699, 3629, 4466, 14983, 28705, 5727, 16075, 911, 389, 28723, 325, 28750, 28734, 28740, 28783, 28726, 28731, 2753, 28705, 11469, 911, 389, 28723, 325, 28750, 28734, 28750, 28750, 28731, 28705, 442, 938, 264, 5541, 28733, 1213, 2582, 5587, 28705, 11228, 491, 911, 389, 28723, 325, 28750, 28734, 28740, 28783, 28731, 28705, 690, 1235, 459, 8040, 272, 1195, 300, 5318, 9796, 28723, 365, 1998, 28732, 28734, 28731, 10806, 15588, 298, 272, 351, 10485, 5587, 28723, 560, 1745, 298, 8018, 378, 298, 272, 367, 4341, 10485, 5587, 28725, 478, 938, 272, 3298, 7232, 288, 10777, 28705, 351, 19836, 911, 389, 28723, 325, 28750, 28734, 28740, 28782, 28731, 842, 560, 2830, 28725, 478, 7232, 28705, 28782, 22162, 13875, 298, 1221, 264, 1665, 354, 365, 1998, 28723, 1263, 272, 24497, 5587, 28725, 1159, 7232, 288, 28725, 272, 15379, 907, 15167, 1059, 264, 6431, 26656, 2524, 5722, 298, 9131, 4190, 28723, 415, 4693, 302, 272, 26656, 349, 13492, 390, 297, 7737, 567, 1609, 8041, 28716, 17739, 28745, 28705, 382, 2015, 1079, 911, 389, 28723, 325, 28750, 28734, 28740, 28774, 28731, 842, 7829, 272, 4519, 28755, 304, 272, 4920, 460, 13492, 486, 396, 351, 11661, 395, 989, 7918, 13083, 304, 28705, 28740, 28750, 28783, 8007, 354, 1430, 7487, 28723, 415, 25159, 5572, 460, 8729, 298, 347, 22630, 28779, 28705, 12132, 2035, 911, 389, 28723, 325, 28750, 28734, 28740, 28782, 28731, 28705, 354, 272, 7918, 13083, 304, 17157, 28716, 354, 272, 3825, 7487, 28723, 13407, 272, 3493, 3830, 28725, 478, 7925, 1560, 26882, 298, 28705, 28787, 714, 28705, 28770, 354, 272, 4154, 304, 15150, 13466, 28723, 816, 5835, 272, 4994, 395, 272, 28705, 10534, 4041, 705, 567, 9236, 325, 28750, 28734, 28740, 28781, 28731, 28705, 6997, 7242, 354, 28705, 28782, 28734, 23784, 28713, 304, 868, 938, 272, 2229, 395, 272, 1489, 15150, 4320, 28723, 1791, 5407, 369, 478, 460, 459, 1524, 921, 1929, 27931, 272, 4397, 486, 3633, 1249, 5823, 28725, 478, 835, 3851, 6488, 1560, 272, 4850, 304, 8478, 302, 272, 351, 11661, 28725, 562, 863, 459, 1300, 707, 5247, 297, 4397, 28723, 1263, 813, 2038, 28725, 272, 399, 1383, 28755, 9786, 349, 12282, 2296, 28705, 14031, 263, 28733, 28728, 28740, 382, 2015, 1079, 911, 389, 28723, 325, 28750, 28734, 28750, 28734, 28731, 28705, 395, 12734, 341, 4042, 3953, 304, 16710, 3320, 8925, 28723, 5800, 26259, 11031, 302, 14031, 263, 28705, 382, 2015, 1079, 911, 389, 28723, 325, 28750, 28734, 28750, 28740, 2753, 28705, 28750, 28734, 28750, 28770, 28731, 2405, 741, 633, 25556, 298, 4916, 4397, 28725, 478, 4987, 459, 298, 938, 706, 354, 272, 14408, 302, 25233, 28723, 816, 938, 264, 7191, 6084, 1665, 2764, 354, 813, 6878, 395, 28705, 28782, 28740, 28750, 16710, 3320, 304, 28705, 28740, 28750, 28783, 341, 4042, 3953, 13631, 304, 1300, 378, 6741, 1897, 274, 272, 4920, 4154, 1759, 298, 5902, 272, 26882, 28723, 1684, 1285, 5195, 272, 3469, 15379, 28725, 478, 1372, 272, 1348, 4693, 390, 7737, 567, 1609, 8041, 28716, 17739, 28745, 28705, 382, 2015, 1079, 911, 389, 28723, 325, 28750, 28734, 28740, 28774, 28731, 1200, 1312, 354, 272, 2859, 28733, 13267, 13875, 28725, 272, 1285, 5722, 349, 13492, 390, 24247, 6023, 395, 1560, 2072, 304, 27493, 1250, 5621, 2458, 28723, 20956, 354, 272, 16710, 3320, 744, 302, 272, 1665, 1413, 264, 19348, 28779, 3601, 28705, 689, 969, 911, 389, 28723, 325, 28750, 28734, 28740, 28781, 28731, 1200, 544, 272, 799, 12167, 460, 351, 11661, 28713, 395, 28705, 28750, 7918, 13083, 304, 28705, 28740, 28750, 28783, 8007, 302, 1430, 7487, 28723, 816, 511, 459, 938, 707, 1933, 307, 1449, 28705, 382, 2015, 1079, 911, 389, 28723, 325, 28750, 28734, 28750, 28734, 28731, 1200, 524, 28758, 19903, 28705, 382, 2015, 1079, 911, 389, 28723, 325, 28750, 28734, 28750, 28734, 28731, 28705, 304, 524, 28758, 4549, 7161, 28705, 382, 2015, 1079, 911, 389, 28723, 325, 28750, 28734, 28750, 28740, 28731, 28705, 25556, 297, 272, 11354, 298, 8096, 272, 17656, 302, 272, 524, 28758, 1850, 28723, 1684, 1285, 5195, 2859, 28733, 13267, 15972, 28725, 478, 4662, 7975, 272, 1285, 5722, 8109, 288, 264, 24507, 380, 5165, 390, 1419, 297, 28705, 1091, 13744, 911, 389, 28723, 325, 28750, 28734, 28750, 28750, 28731, 842, 816, 938, 652, 28705, 29152, 28733, 28711, 584, 298, 1003, 7264, 456, 2700, 28725, 304, 1854, 378, 312, 28733, 26911, 272, 27419, 1850, 28725, 478, 312, 28733, 7019, 272, 524, 28758, 1850, 25485, 298, 9087, 272, 7873, 28723, 1263, 6896, 28705, 28750, 302, 330, 14462, 28725, 478, 5835, 272, 4920, 354, 28705, 28782, 28734, 28734, 23784, 28713, 304, 28705, 28740, 28734, 28734, 21135, 5944, 354, 1430, 23784, 28723, 816, 2264, 4397, 302, 272, 4920, 477, 272, 1432, 23784, 1671, 707, 2935, 17203, 15117, 28723, 1791, 15350, 864, 9096, 11753, 302, 3633, 1249, 5823, 28725, 478, 5090, 938, 272, 10898, 1526, 2229, 302, 272, 14031, 263, 8073, 739, 28698, 813, 14107, 28725, 3741, 354, 272, 20575, 13466, 970, 478, 5835, 272, 2229, 477, 15147, 297, 6896, 28705, 28740, 28723, 1094, 2278, 7484, 349, 369, 28725, 5432, 14617, 710, 28733, 28247, 17628, 6741, 7263, 4397, 28725, 272, 2038, 1235, 459, 2225, 1162, 739, 478, 835, 5998, 272, 17628, 302, 272, 27205, 448, 4920, 28723, 816, 379, 664, 482, 369, 378, 349, 2940, 298, 27205, 448, 10086, 1250, 10231, 297, 741, 1862, 1187, 4250, 369, 590, 460, 9638, 298, 8957, 28723, 13, 13, 11268, 13623, 28747, 20308, 4874, 28735, 13, 28705, 20308, 4874, 28735, 415, 2191, 1204, 302, 813, 10367, 349, 4894, 297, 28705, 9240, 28705, 28750, 842, 816, 541, 1032, 369, 330, 14462, 12282, 575, 487, 14367, 365, 1998, 28732, 28734, 28731, 297, 544, 272, 4466, 6472, 28723, 330, 14462, 4510, 1855, 272, 15341, 4397, 356, 272, 24497, 5587, 28725, 562, 1019, 369, 575, 487, 14367, 365, 1998, 28732, 28734, 9572, 4915, 28753, 690, 541, 2735, 272, 1132, 4605, 28723, 816, 6836, 272, 1179, 4397, 298, 989, 6494, 28723, 4205, 28725, 272, 3814, 2229, 659, 264, 1873, 1178, 4479, 5463, 4338, 821, 272, 22668, 2229, 1096, 272, 3814, 2229, 349, 10898, 298, 9220, 1356, 2894, 15379, 22456, 28725, 1312, 272, 22668, 2229, 865, 4347, 2485, 533, 2430, 302, 272, 7768, 304, 865, 6782, 28713, 272, 6768, 28723, 7110, 28725, 272, 3814, 2229, 659, 2108, 4676, 298, 754, 7785, 304, 5312, 1873, 2952, 5463, 28723, 7052, 28725, 486, 15350, 2979, 272, 5566, 28725, 813, 2038, 341, 814, 274, 298, 1300, 396, 2992, 7768, 369, 8681, 298, 272, 1348, 14120, 28725, 459, 298, 9443, 272, 1132, 6768, 28723, 1263, 1287, 4918, 28725, 272, 15562, 460, 459, 5894, 297, 2035, 1070, 28723, 1263, 2757, 28725, 513, 264, 2930, 15588, 4274, 298, 272, 3500, 28725, 1854, 272, 3500, 1235, 459, 2318, 28725, 624, 3573, 1912, 910, 1188, 4274, 349, 1528, 1298, 1814, 1331, 1077, 24541, 5712, 466, 17504, 28705, 5066, 9876, 438, 315, 3100, 28754, 28705, 28750, 28734, 28750, 28770, 28705, 7589, 486, 8809, 15379, 28723, 415, 1348, 4620, 15588, 298, 272, 17734, 739, 2552, 9353, 28713, 460, 11422, 2940, 298, 272, 15816, 15596, 28723, 851, 1348, 20757, 349, 835, 9951, 297, 28705, 11228, 491, 911, 389, 28723, 325, 28750, 28734, 28740, 28783, 28731, 842, 1602, 1235, 272, 4782, 302, 13466, 8015, 272, 2903, 28804, 4205, 28725, 390, 3349, 28725, 354, 544, 272, 3090, 1549, 302, 5562, 28725, 5998, 699, 2373, 272, 1348, 3638, 349, 1873, 821, 5998, 699, 1444, 1581, 9796, 28723, 560, 1167, 6472, 28725, 365, 1998, 28732, 28734, 9572, 4915, 28753, 349, 21973, 395, 330, 14462, 28723, 2993, 28725, 330, 14462, 480, 1303, 297, 3893, 28733, 5553, 5998, 28723, 20538, 739, 5998, 699, 1444, 1482, 304, 2338, 9796, 304, 5998, 699, 477, 1876, 298, 1482, 28725, 330, 14462, 575, 487, 14367, 272, 2388, 25465, 486, 264, 2475, 9829, 28725, 690, 12825, 272, 2967, 2952, 278, 2437, 302, 264, 3814, 2229, 754, 396, 22668, 2229, 28723, 816, 835, 1300, 369, 330, 14462, 2870, 10883, 1929, 1873, 938, 302, 1949, 271, 5377, 1178, 28723, 330, 14462, 12282, 575, 487, 14367, 2388, 25465, 739, 1413, 272, 284, 28750, 28706, 13466, 390, 272, 21322, 2487, 13466, 304, 575, 487, 14367, 1080, 3090, 1549, 739, 1413, 272, 20575, 13466, 390, 272, 21322, 2487, 13466, 28723, 11302, 28725, 739, 5998, 699, 477, 272, 6750, 13466, 28725, 3741, 354, 272, 351, 10485, 2751, 28725, 330, 14462, 575, 487, 14367, 799, 3090, 1549, 369, 5835, 272, 1526, 2229, 356, 624, 302, 272, 9430, 26882, 28570, 28723, 851, 1347, 14418, 272, 10431, 2437, 302, 264, 1526, 2229, 298, 347, 10898, 356, 264, 12836, 808, 302, 9021, 28725, 690, 829, 347, 680, 12302, 297, 1353, 28733, 9471, 22141, 28723, 1602, 349, 330, 14462, 19927, 486, 272, 15379, 968, 2045, 28804, 3880, 1327, 395, 365, 1998, 28732, 28734, 557, 330, 14462, 349, 3448, 16458, 298, 272, 4782, 302, 15379, 968, 2045, 28725, 562, 378, 1309, 2870, 264, 1628, 5133, 28723, 5800, 272, 15379, 5312, 272, 1348, 1871, 28725, 478, 1300, 330, 14462, 297, 272, 393, 3402, 4915, 28753, 5587, 25597, 1188, 1873, 821, 297, 272, 24497, 5587, 297, 544, 1369, 4469, 28723, 816, 6836, 378, 298, 272, 1639, 369, 2859, 28733, 5449, 2585, 15972, 506, 1457, 457, 1871, 304, 2405, 264, 991, 28709, 1039, 13894, 297, 272, 5566, 2764, 821, 272, 16626, 579, 369, 378, 541, 3084, 264, 680, 5857, 21135, 298, 8327, 272, 2992, 297, 2103, 28723, 6021, 783, 2979, 346, 28725, 5432, 2461, 2735, 298, 680, 1871, 28725, 330, 14462, 28733, 4915, 28753, 25597, 6871, 821, 330, 14462, 28733, 28758, 3402, 4915, 28753, 356, 5151, 28723, 415, 8964, 10467, 6881, 739, 5998, 699, 477, 284, 28750, 28706, 13466, 28723, 816, 379, 664, 482, 456, 298, 272, 1639, 272, 399, 1383, 28755, 1526, 2229, 302, 330, 14462, 28733, 4915, 28753, 349, 459, 10898, 1162, 395, 272, 2369, 11503, 28733, 11438, 28725, 562, 478, 340, 642, 3629, 11597, 298, 3437, 771, 28723, 1602, 1235, 272, 1474, 302, 6695, 697, 8015, 272, 4397, 28804, 816, 9086, 272, 534, 28733, 305, 352, 3881, 356, 284, 28750, 28706, 21322, 2487, 13466, 970, 478, 11204, 272, 1474, 302, 6695, 697, 2373, 371, 28740, 28725, 28705, 28750, 28725, 28705, 28782, 28725, 28705, 28740, 28734, 28725, 28705, 28750, 28734, 28725, 28705, 28782, 28734, 28725, 28705, 28740, 28734, 28734, 28725, 28705, 28750, 28734, 28734, 28725, 28705, 28782, 28734, 28734, 28725, 28705, 28740, 28734, 28734, 28734, 2051, 415, 1204, 349, 4894, 297, 28705, 9240, 28705, 28770, 842, 365, 1998, 28732, 28734, 28731, 25566, 395, 2859, 28733, 1056, 22141, 304, 9566, 3208, 438, 2429, 28705, 28740, 28734, 6695, 697, 298, 1147, 4119, 871, 3742, 4397, 28723, 560, 9349, 28725, 330, 14462, 6695, 1002, 1710, 840, 14204, 395, 390, 1664, 390, 28705, 28750, 21699, 2420, 28723, 851, 6695, 1002, 272, 11352, 7324, 12832, 302, 813, 2038, 28723, 11302, 28725, 272, 4397, 302, 330, 14462, 11478, 6488, 390, 680, 21699, 2420, 460, 3857, 5298, 28705, 28740, 28734, 28734, 28725, 690, 1347, 14418, 272, 10431, 2437, 302, 272, 2038, 28723, 816, 835, 5640, 369, 736, 349, 264, 12156, 297, 272, 4397, 11903, 438, 28705, 28782, 6695, 697, 304, 478, 379, 664, 482, 378, 349, 7334, 486, 2607, 6695, 697, 28723, 13, 13, 11268, 13623, 28747, 399, 2980, 14371, 394, 13351, 13, 28705, 399, 2980, 14371, 394, 13351, 315, 2162, 352, 5168, 477, 13875, 4258, 2123, 3791, 356, 503, 5174, 5168, 477, 865, 15379, 541, 347, 15756, 20577, 2458, 778, 989, 4938, 28725, 624, 2818, 356, 4519, 16023, 28705, 5727, 16075, 911, 389, 28723, 325, 28750, 28734, 28740, 28783, 28708, 28731, 2753, 28705, 19803, 911, 389, 28723, 325, 28750, 28734, 28750, 28750, 28731, 2753, 28705, 11228, 491, 911, 389, 28723, 325, 28750, 28734, 28740, 28783, 28731, 28705, 304, 624, 2818, 356, 1350, 1197, 19998, 283, 505, 503, 5174, 5168, 325, 11393, 2234, 28731, 28705, 10488, 567, 413, 1737, 266, 325, 28750, 28734, 28740, 28784, 28731, 2753, 28705, 5727, 16075, 911, 389, 28723, 325, 28750, 28734, 28740, 28783, 28726, 28731, 2753, 28705, 11469, 911, 389, 28723, 325, 28750, 28734, 28750, 28750, 28731, 842, 415, 6421, 5641, 302, 272, 907, 2071, 349, 298, 2822, 396, 4519, 28755, 369, 11745, 264, 1665, 8265, 5964, 298, 272, 2992, 369, 7334, 272, 8265, 28723, 28705, 5727, 16075, 911, 389, 28723, 325, 28750, 28734, 28740, 28783, 28708, 28731, 2753, 28705, 19803, 911, 389, 28723, 325, 28750, 28734, 28750, 28750, 28731, 28705, 938, 272, 4519, 28755, 298, 3870, 272, 7583, 28742, 28713, 15379, 22456, 28725, 868, 12049, 272, 503, 5174, 5168, 2700, 395, 4787, 13779, 28723, 28705, 11228, 491, 911, 389, 28723, 325, 28750, 28734, 28740, 28783, 28731, 28705, 4976, 272, 4519, 28755, 298, 264, 5541, 28733, 12058, 286, 5587, 304, 1960, 23860, 272, 3638, 395, 1945, 19304, 302, 272, 5541, 21699, 695, 28723, 27588, 477, 1167, 5562, 28725, 813, 4431, 6098, 264, 3814, 2229, 298, 11319, 272, 4788, 302, 272, 21322, 2487, 28723, 560, 272, 1676, 2071, 302, 13945, 28725, 272, 6421, 5641, 349, 264, 14256, 10049, 369, 11731, 8137, 272, 6695, 1028, 28742, 28713, 304, 272, 8073, 28742, 28713, 15379, 21699, 2420, 28723, 2479, 272, 14256, 10049, 14449, 390, 264, 11499, 908, 28725, 304, 272, 8073, 28742, 28713, 4920, 349, 10898, 486, 399, 28758, 28705, 10488, 567, 413, 1737, 266, 325, 28750, 28734, 28740, 28784, 28731, 842, 1136, 264, 3924, 1435, 28725, 297, 1745, 298, 5835, 456, 14256, 10049, 272, 8073, 659, 298, 10876, 14113, 395, 272, 4466, 298, 7072, 7087, 11714, 28723, 27588, 477, 1167, 5562, 28725, 813, 2038, 1235, 459, 2699, 3629, 14983, 395, 272, 4466, 28725, 25748, 6129, 28733, 7063, 503, 5174, 477, 272, 6695, 352, 13466, 28723, 13, 13, 11268, 13623, 28747, 1298, 8521, 27205, 448, 8076, 297, 5161, 28733, 14727, 13, 28705, 1298, 8521, 27205, 448, 8076, 297, 5161, 28733, 14727, 5800, 5998, 699, 710, 28733, 28247, 4994, 659, 2727, 264, 18669, 4431, 297, 4229, 3842, 9457, 325, 28759, 11661, 28731, 28705, 6624, 2294, 911, 389, 28723, 325, 28750, 28734, 28740, 28774, 28731, 2753, 28705, 6444, 4401, 911, 389, 28723, 325, 28750, 28734, 28740, 28774, 28731, 2753, 28705, 365, 300, 4876, 4499, 911, 389, 28723, 325, 28750, 28734, 28750, 28740, 28731, 28705, 304, 659, 750, 2719, 680, 4387, 297, 6074, 8021, 325, 19667, 28731, 28705, 650, 911, 389, 28723, 325, 28750, 28734, 28750, 28750, 28731, 2753, 28705, 365, 300, 4876, 4499, 911, 389, 28723, 325, 28750, 28734, 28750, 28740, 28731, 1200, 312, 8521, 27205, 448, 8076, 349, 2108, 9604, 297, 272, 1834, 302, 5161, 28733, 14727, 28705, 3786, 283, 18404, 911, 389, 28723, 325, 28750, 28734, 28750, 28750, 28731, 842, 4822, 6594, 3791, 3232, 356, 5998, 699, 10086, 28705, 19357, 911, 389, 28723, 325, 28750, 28734, 28740, 28787, 28731, 2753, 28705, 19803, 911, 389, 28723, 325, 28750, 28734, 28750, 28750, 28731, 2753, 28705, 3786, 283, 18404, 911, 389, 28723, 325, 28750, 28734, 28750, 28750, 28731, 842, 1418, 272, 799, 1021, 28725, 272, 1526, 2229, 28725, 264, 1212, 302, 6787, 20968, 2229, 28725, 369, 349, 21690, 10898, 486, 1008, 28733, 8554, 23313, 5168, 10427, 2910, 272, 5391, 5097, 302, 2229, 28733, 5527, 25234, 466, 5168, 7737, 567, 1609, 8041, 28716, 17739, 28745, 28705, 382, 2015, 1079, 911, 389, 28723, 325, 28750, 28734, 28740, 28774, 2753, 259, 28750, 28734, 28750, 28734, 2753, 28705, 28750, 28734, 28750, 28740, 28745, 28705, 28750, 28734, 28750, 28770, 344, 28705, 23181, 2950, 17747, 27039, 911, 389, 28723, 325, 28750, 28734, 28750, 28734, 28731, 2753, 28705, 1609, 28712, 785, 28727, 497, 263, 911, 389, 28723, 325, 28750, 28734, 28750, 28734, 28731, 842, 2993, 28725, 272, 5998, 2437, 302, 1167, 1298, 1814, 1331, 1077, 24541, 5712, 466, 17504, 5066, 9876, 438, 315, 3100, 28754, 28705, 28750, 28734, 28750, 28770, 1526, 4994, 349, 459, 1162, 28733, 13082, 823, 28723, 28705, 318, 950, 283, 911, 389, 28723, 325, 28750, 28734, 28750, 28734, 28731, 28705, 2822, 28713, 264, 4920, 486, 1413, 264, 710, 28733, 28247, 1526, 2229, 477, 23083, 1178, 304, 6695, 1002, 11352, 6129, 28733, 7063, 304, 1664, 28733, 7063, 4397, 28723, 816, 4916, 3714, 456, 5007, 486, 14328, 264, 1581, 5587, 28725, 613, 28723, 28706, 28723, 503, 5174, 5168, 28723, 560, 2830, 28725, 478, 16287, 272, 3638, 298, 272, 2229, 486, 3977, 288, 272, 7583, 1312, 28705, 318, 950, 283, 911, 389, 28723, 325, 28750, 28734, 28750, 28734, 28731, 28705, 1960, 23860, 272, 3638, 486, 264, 3414, 5307, 11499, 908, 690, 349, 2108, 14494, 297, 264, 1353, 28733, 9471, 5587, 28723, 13, 13, 11268, 13623, 28747, 15601, 24164, 10946, 567, 4192, 3100, 2252, 1702, 13, 28705, 15601, 24164, 10946, 567, 4192, 3100, 2252, 1702, 560, 456, 3830, 28725, 478, 2169, 330, 14462, 28725, 264, 2229, 28733, 5527, 2038, 354, 503, 5174, 477, 13875, 28723, 415, 6421, 302, 272, 2038, 12573, 1046, 272, 1982, 302, 264, 710, 28733, 28247, 1526, 2229, 304, 297, 740, 274, 378, 275, 28723, 28712, 28723, 28707, 28723, 2992, 2787, 486, 3344, 272, 4317, 2785, 28723, 1418, 272, 17734, 21322, 2487, 477, 272, 384, 7814, 25660, 28725, 478, 16499, 11352, 4397, 6731, 298, 2388, 25465, 28725, 1019, 739, 741, 2388, 25465, 541, 2735, 272, 1132, 1665, 28723, 5800, 330, 14462, 25597, 1162, 28725, 736, 460, 1309, 1287, 9658, 28723, 4205, 28725, 10589, 7444, 14631, 2663, 395, 8021, 28723, 5800, 330, 14462, 3791, 3448, 1162, 297, 272, 24497, 5587, 28725, 736, 349, 1309, 264, 2475, 10467, 6731, 395, 272, 393, 3402, 4915, 28753, 5587, 970, 272, 2859, 28733, 4404, 15972, 460, 7975, 28723, 816, 6836, 456, 298, 272, 1639, 369, 272, 4320, 5439, 302, 272, 18012, 27419, 4320, 993, 459, 347, 7898, 2066, 298, 1914, 272, 21135, 2038, 298, 1300, 396, 13387, 1179, 5165, 28723, 7052, 28725, 297, 456, 3830, 28725, 478, 865, 3881, 272, 1080, 3588, 5587, 970, 1560, 272, 21322, 2487, 304, 16082, 9866, 460, 6321, 2673, 9796, 28723, 1418, 272, 799, 1021, 28725, 10589, 14631, 2663, 297, 264, 4008, 28733, 9701, 10403, 304, 541, 835, 503, 9647, 8222, 4636, 2187, 349, 459, 1019, 3684, 298, 10589, 4135, 5855, 897, 288, 1167, 19573, 622, 1565, 582, 18053, 298, 5998, 2673, 1581, 21322, 8447, 304, 1019, 5090, 477, 2930, 10506, 28723, 14671, 28725, 354, 741, 9796, 28725, 1019, 10589, 3573, 6619, 6129, 28733, 7063, 503, 5174, 486, 865, 6265, 2663, 28723, 851, 993, 2940, 298, 272, 3638, 28742, 28713, 17599, 442, 4716, 10077, 27195, 6266, 28723, 1537, 28725, 1019, 395, 4979, 13126, 28725, 10589, 1309, 927, 298, 2844, 864, 297, 272, 4466, 298, 12049, 741, 9796, 28723, 851, 8972, 1002, 396, 3270, 5168, 6896, 28705, 28770, 390, 396, 8223, 298, 813, 10782, 28723, 816, 340, 642, 1167, 13817, 298, 3437, 771, 28723, 13, 733, 28748, 16289, 28793, 733, 7384, 921, 617, 304, 7092, 884, 28793, 13, 28789, 2012, 5174, 5168, 2038, 330, 14462, 28767, 415, 330, 14462, 10782, 15890, 396, 503, 5174, 5168, 2038, 369, 2822, 28713, 264, 1526, 2229, 477, 2609, 2659, 28725, 304, 868, 6098, 369, 2229, 298, 13701, 6768, 477, 396, 15379, 28733, 5041, 6695, 352, 28725, 1347, 28717, 3706, 264, 4842, 4431, 298, 25234, 466, 5168, 28723, 13, 28789, 27869, 298, 5998, 5168, 9804, 28767, 415, 3830, 3342, 274, 5853, 3684, 1218, 1444, 330, 14462, 304, 5998, 5168, 9804, 297, 7768, 27472, 3332, 28725, 690, 993, 2169, 264, 7092, 10403, 354, 25234, 466, 5168, 28723, 13, 13, 28792, 28753, 322, 2256, 6494, 354, 19871, 28793, 13, 28789, 3490, 566, 14593, 9520, 304, 2903, 28767, 415, 3830, 1424, 294, 26344, 10181, 272, 4782, 302, 2388, 25465, 28725, 14593, 9520, 28725, 304, 8260, 356, 5166, 15312, 20031, 28725, 7501, 2967, 5566, 302, 272, 2952, 463, 2437, 304, 1397, 294, 2426, 302, 272, 330, 14462, 10782, 28723, 13, 28789, 856, 28717, 864, 304, 2380, 522, 14909, 28767, 415, 3830, 349, 7567, 297, 264, 3588, 304, 2380, 522, 8717, 28725, 2492, 378, 14494, 298, 264, 5335, 7938, 28723, 13, 13, 28792, 28753, 322, 2256, 6494, 354, 312, 7183, 28793, 13, 28789, 1733, 6206, 1581, 6752, 477, 6594, 13945, 28767, 415, 7196, 302, 330, 14462, 754, 6594, 5562, 1259, 390, 365, 1998, 304, 2229, 28733, 5527, 25234, 466, 5168, 13945, 460, 459, 15956, 10535, 575, 28725, 304, 264, 3081, 263, 1581, 6752, 993, 347, 4892, 354, 19871, 28723, 13, 28789, 24241, 1345, 15197, 297, 3588, 6472, 28767, 415, 15197, 349, 8659, 298, 347, 356, 10102, 3588, 9796, 28725, 304, 736, 349, 264, 19368, 298, 9584, 330, 14462, 298, 799, 18539, 297, 680, 4630, 6472, 28723, 13, 13, 28792, 28735, 16939, 594, 354, 14204, 28793, 13, 28789, 966, 6026, 1581, 6752, 477, 5202, 13945, 28767, 415, 11739, 829, 1388, 727, 298, 15956, 4897, 575, 272, 11090, 1444, 330, 14462, 304, 6594, 5562, 737, 365, 1998, 304, 2229, 28733, 5527, 25234, 466, 5168, 13945, 28723, 13, 28789, 27869, 297, 680, 4630, 6472, 28767, 661, 682, 347, 21158, 298, 9584, 330, 14462, 298, 799, 18539, 297, 680, 4630, 6472, 298, 1347, 2210, 871, 4397, 297, 7625, 3796, 5168, 20031, 28723, 13, 13, 2, 28705, 13] #################### encoded labels #################### [-100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, 733, 7384, 921, 617, 304, 7092, 884, 28793, 13, 28789, 2012, 5174, 5168, 2038, 330, 14462, 28767, 415, 330, 14462, 10782, 15890, 396, 503, 5174, 5168, 2038, 369, 2822, 28713, 264, 1526, 2229, 477, 2609, 2659, 28725, 304, 868, 6098, 369, 2229, 298, 13701, 6768, 477, 396, 15379, 28733, 5041, 6695, 352, 28725, 1347, 28717, 3706, 264, 4842, 4431, 298, 25234, 466, 5168, 28723, 13, 28789, 27869, 298, 5998, 5168, 9804, 28767, 415, 3830, 3342, 274, 5853, 3684, 1218, 1444, 330, 14462, 304, 5998, 5168, 9804, 297, 7768, 27472, 3332, 28725, 690, 993, 2169, 264, 7092, 10403, 354, 25234, 466, 5168, 28723, 13, 13, 28792, 28753, 322, 2256, 6494, 354, 19871, 28793, 13, 28789, 3490, 566, 14593, 9520, 304, 2903, 28767, 415, 3830, 1424, 294, 26344, 10181, 272, 4782, 302, 2388, 25465, 28725, 14593, 9520, 28725, 304, 8260, 356, 5166, 15312, 20031, 28725, 7501, 2967, 5566, 302, 272, 2952, 463, 2437, 304, 1397, 294, 2426, 302, 272, 330, 14462, 10782, 28723, 13, 28789, 856, 28717, 864, 304, 2380, 522, 14909, 28767, 415, 3830, 349, 7567, 297, 264, 3588, 304, 2380, 522, 8717, 28725, 2492, 378, 14494, 298, 264, 5335, 7938, 28723, 13, 13, 28792, 28753, 322, 2256, 6494, 354, 312, 7183, 28793, 13, 28789, 1733, 6206, 1581, 6752, 477, 6594, 13945, 28767, 415, 7196, 302, 330, 14462, 754, 6594, 5562, 1259, 390, 365, 1998, 304, 2229, 28733, 5527, 25234, 466, 5168, 13945, 460, 459, 15956, 10535, 575, 28725, 304, 264, 3081, 263, 1581, 6752, 993, 347, 4892, 354, 19871, 28723, 13, 28789, 24241, 1345, 15197, 297, 3588, 6472, 28767, 415, 15197, 349, 8659, 298, 347, 356, 10102, 3588, 9796, 28725, 304, 736, 349, 264, 19368, 298, 9584, 330, 14462, 298, 799, 18539, 297, 680, 4630, 6472, 28723, 13, 13, 28792, 28735, 16939, 594, 354, 14204, 28793, 13, 28789, 966, 6026, 1581, 6752, 477, 5202, 13945, 28767, 415, 11739, 829, 1388, 727, 298, 15956, 4897, 575, 272, 11090, 1444, 330, 14462, 304, 6594, 5562, 737, 365, 1998, 304, 2229, 28733, 5527, 25234, 466, 5168, 13945, 28723, 13, 28789, 27869, 297, 680, 4630, 6472, 28767, 661, 682, 347, 21158, 298, 9584, 330, 14462, 298, 799, 18539, 297, 680, 4630, 6472, 298, 1347, 2210, 871, 4397, 297, 7625, 3796, 5168, 20031, 28723, 13, 13, 2, -100, -100]