TESTING ENVIRONMENT Sample Clauses

TESTING ENVIRONMENT. You shall provide a testing environment that meets all Precision Exams system requirements.
TESTING ENVIRONMENT. The subscription includes one or more testing environments, as set forth in the End User’s agreement with the Bizagi Authorized Reseller, that are used to conduct user-acceptance tests of the End User’s Applications (the “Testing Environment”). The End User is responsible for developing the End User’s Applications in Studio Cloud Services and deploying them in the Testing Environment(s). This is performed using an upload function available in the Management Console.
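The deploy step described above is vendor-specific (the upload is done through the Management Console), but the shape of such a deployment request can be sketched generically. Everything in the sketch below, including the function name, field names, and environment labels, is an illustrative assumption, not the actual Bizagi interface:

```python
# Hypothetical sketch of describing an application upload to a named
# testing environment. Field names and environment labels are assumptions
# for illustration; the real upload is a Management Console function.

def build_deploy_request(app_name, package_path, target_env):
    """Describe a deployment of a packaged application to one environment."""
    allowed = {"testing", "staging"}  # assumed environment labels
    if target_env not in allowed:
        raise ValueError(f"unknown environment: {target_env}")
    return {
        "application": app_name,
        "package": package_path,
        "environment": target_env,
    }

request = build_deploy_request("ClaimsApp", "claims-v2.bpm", "testing")
```

The point of the sketch is only the division of responsibility the clause establishes: the End User builds the application and chooses the target Testing Environment; anything outside the agreed environments is rejected.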
TESTING ENVIRONMENT. Prior to the delivery of each Deliverable, Client will be able to test such Deliverable in a testing environment. Betty Blocks will provide Client access to this testing environment by sending a location (URL) and, if necessary, login details. Access to the testing environment is strictly tied to Client. Client is not allowed to grant third parties access to this testing environment by sending the location (URL) and/or the login details to any third party, unless expressly indicated to the contrary in the Agreement. Client is aware that the testing environment is not suitable for production purposes in any form, and Client is not allowed to use the testing environment for such purposes. Betty Blocks gives no guarantees regarding the availability, completeness, or correct operation of the testing environment. Betty Blocks is, however, aware that the availability of the testing environment is necessary for delivery of the Deliverables, and will therefore endeavor to keep the testing environment available.
TESTING ENVIRONMENT. The KPIs will capture information about quality and performance. To obtain comparable data it is necessary to know the position of the vehicle initiating an eCall. During the field test phase, eCalls will be performed during defined periods of time without real collisions occurring. For the HeERO field test the sample units will not be integrated into the car. Because this is not a series production process, performance indicators such as shock resistance and backup battery availability without main power supply will not be available. All parameters necessary for the evaluation of the listed KPIs will be logged in the IVS and the PSAPs.
TESTING ENVIRONMENT. The overall eCall chain will be tested in the ENT eCall laboratory. The necessary prerequisite is full component compliance with the eCall standards. The eCall initiation will be performed according to the particular scenario set-up, as outlined earlier in this document. All Group 1 KPIs (KPI1 - KPI5) will be measured and observed simultaneously during the T&V procedure. Measurement procedures for the related KPIs are outlined below; they apply to planned scenarios L1 - L7.
The overall eCall chain will also be tested in a real environment, following the scenario specifications outlined above. The necessary prerequisite is full component compliance with the eCall standards. The eCall initiation will be performed according to the particular scenario set-up, as outlined earlier. All Group 2 KPIs (KPI1 - KPI5) will be measured and observed simultaneously during the T&V procedure, or post-processed using the records in the event logs. Measurement procedures for the related KPIs are outlined below; they apply to planned scenarios R1 - R4.
The eCall SOP will be tested in a real environment, following the scenario specifications outlined earlier in this document. The necessary prerequisite is full component compliance with the eCall standards. The eCall initiation will be performed according to the particular scenario set-up, as outlined earlier in this document. All Group 2 KPIs (KPI1 - KPI5) will be measured and observed simultaneously during the T&V procedure, or post-processed using the records in the event logs. Measurement procedures for the related KPIs are outlined below; they apply to planned scenarios S1 - S2.
The objective of the position estimation for eCall T&V is to identify the requirements for position estimation for eCall and to assess GPS positioning performance from the eCall perspective, regardless of the type of GPS receiver utilised.
The intention of this T&V segment is to objectively assess the risks of sole GPS utilisation, and to propose the minimum requirements and system integration architecture for an appropriate position estimation procedure for eCall. In due course, the Group 4 T&V scenarios will be conducted separately from the rest of the eCall T&V, using GPS receivers of the same quality level as the receivers to be utilised in the eCall IVS. In the first phase of the eCall Group 4 T&V scenarios, GPS, GPS+EGNOS and GPS/GLONASS will be utilised....
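Assessing GPS positioning performance ultimately reduces to comparing each position reported by the receiver against a trusted reference position. A minimal sketch of that comparison using the haversine great-circle distance follows; the function name, the sample coordinates, and their use here are illustrative assumptions, not part of the HeERO test specification:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS-84 points."""
    r = 6_371_000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Position error of a reported GPS fix versus a reference fix
# (coordinates are illustrative).
reported = (57.7089, 11.9746)
reference = (57.7090, 11.9748)
error_m = haversine_m(*reported, *reference)
```

Repeating this per fix over a test drive yields the error distribution from which minimum requirements for eCall position estimation can be argued.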
TESTING ENVIRONMENT. During tests, the time stamp clock of the data recorded in log files will be synchronized using GPS-received time, both in the vehicle and in the PSAP. Log files will be produced in the vehicle IVSs and in the PSAP to provide data for the calculation of the HeERO Key Performance Indicators (KPIs), in order to perform a common evaluation across the project's national pilots. Log file formats will be agreed among the Italian Pilot Partners according to the HeERO KPI requirements agreed by all partners; all IVSs used in the vehicle fleet in Italy will adopt the same log file format. The accuracy of the reported position versus the actual position will also be evaluated in dedicated IVS tests (not performed during the pilot tests in Varese), with reference positions also acquired by a GPS receiver with differential correction (Real Time Kinematic).
Time stamps and data to be logged for the KPIs:
• IVS, for each initiated eCall:
o ID number (incremental counter) of the eCall
o Manual/automatic trigger activation
o T0_IVS – incident detected
o T1_IVS – IVS starts sending the eCall
o MSD contents
o Transmission attempts
o T2_IVS – voice channel is active and driver/operator communication is established
o T3_IVS – voice connection is ended
o eCall communication OK/NOK
• TELECOM OPERATOR:
o T0_MNO – the eCall reaches the 112 telecom operator network
o T1_MNO – the eCall flag is managed
• PSAP VARESE:
o T0_PSAP – the eCall reaches the PSAP
o T1_PSAP – the MSD reaches the PSAP
o T2_PSAP – the processed MSD is presented to the PSAP operator
o T3_PSAP – voice channel is active and the operator can communicate with the driver
o T4_PSAP – the VIN has been decoded with the EUCARIS tool
o T5_PSAP – the PSAP alerts (is ready to alert) the emergency agencies
KPI – analysis of MSD transmission time:
• Minimum, maximum and average transmission time for the MSD correctly received at the PSAP
• Distribution of the MSD transmission time
• Number of non-successful MSD transmission attempts (longer than the [20 sec] maximum)
Start time for MSD transmission: when the IVS starts to send the SYNC signal. End time: when the CRC has been detected as correct by the PSAP modem. Target: 90% of all MSD transmission times shall be below 15 sec.
KPI – analysis of voice channel and disturbance/blocking:
• Duration of voice connection (IVS, PSAP)
• Individual description of disturbances in voice communication (due to in-band transmission during manual/automatic trigger...
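The MSD timing analysis described above (minimum/maximum/average, failure count over the [20 sec] limit, and the 90%-below-15 s target) can be computed directly from the logged SYNC-start and CRC-OK intervals. A sketch follows; the function name and the sample durations are illustrative assumptions, while the 20 s failure limit and the 90%/15 s target come from the clause:

```python
# Sketch of the MSD transmission-time KPI computation. Each duration is
# the interval from the IVS starting to send the SYNC signal to the PSAP
# modem detecting a correct CRC (clocks GPS-synchronized on both ends).

def msd_kpis(durations_s, fail_limit_s=20.0, target_s=15.0, target_share=0.90):
    ok = [d for d in durations_s if d <= fail_limit_s]
    failed = len(durations_s) - len(ok)
    share_below = sum(d < target_s for d in ok) / len(ok) if ok else 0.0
    return {
        "min": min(ok) if ok else None,
        "max": max(ok) if ok else None,
        "avg": sum(ok) / len(ok) if ok else None,
        "failed_attempts": failed,
        "target_met": share_below >= target_share,
    }

# Illustrative durations in seconds; 22.5 s exceeds the 20 s limit.
stats = msd_kpis([3.2, 4.1, 5.0, 6.3, 7.8, 4.4, 5.5, 3.9, 4.7, 22.5])
```

The distribution itself (a histogram of the same durations) is the natural companion output for the common evaluation across national pilots.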
TESTING ENVIRONMENT.
1. Testing the eCall reception
1.1 Test the PSAP modem
All the following tests will analyse the performance of the PSAP modem and its behaviour in different scenarios.
• With or without the eCall flag: these tests will show how the system handles calls with or without the eCall flag. As the implementation of the eCall flag is the responsibility of the MNOs, we are not sure when we will be able to test this feature.
• Standard eCall (the IVS calls the PSAP): these tests will provide information about the normal data flow for eCall.
• IVS redial: these tests will be performed to observe the behaviour of the system when the IVS attempts a redial after the connection has been interrupted.
• Call-back (the PSAP calls the IVS): these tests will analyse how the system behaves in case of a call-back. The call-back function will be used when an ongoing call is interrupted for various reasons. After the call is interrupted, the IVS will try to establish a connection with the PSAP, and if the IVS is not able to do so, the operator will have to use the call-back feature.
• Redundancy: we will test the redundancy of the PSAP modem and different failover strategies.
• Multiple simultaneous calls: these tests will analyse the behaviour of the PSAP modem when several eCalls arrive at the same time.
1.2 Test the voice connection with the operator
These tests will concentrate on the eCall voice path characteristics from the 112 operator’s point of view. They will include, but will not be limited to: answering a call, transferring a call, and ending a call.
1.3 Test the call-back feature from the operator’s point of view
These tests will analyse the call-back functionality from the 112 operator’s point of view, observing the minimum and maximum wait time before attempting call-back.
1.4 Evaluate the defined KPIs
We will evaluate all the defined KPIs using the data gathered during the tests for eCall reception.
2. Test the MSD reception from the point of view of the operator
2.1 Evaluate the usability of the operator interface
This evaluation will help analyse whether the proposed operator interface is user friendly.
2.2 Test the reception of the MSD in the 112 application
We will test whether the MSD is decoded correctly and whether the information is presented to the operator.
2.3 Test the “resend MSD” functionality
These tests will analyse the behaviour of the system when the operator asks ...
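The interrupted-call handling exercised by the redial and call-back tests above follows a simple rule: the IVS redials first, and only if the IVS cannot re-establish the connection does the operator fall back to a PSAP-initiated call-back. A sketch of that decision logic follows; the attempt limit and the function name are illustrative assumptions, not values from the test plan:

```python
# Sketch of the recovery decision after an interrupted eCall: the IVS
# redials the PSAP, and the operator uses the call-back feature only
# when the IVS cannot reconnect. MAX_IVS_REDIALS is an assumed limit.

MAX_IVS_REDIALS = 3

def recovery_action(ivs_redials_attempted, ivs_reconnected):
    if ivs_reconnected:
        return "call restored by IVS redial"
    if ivs_redials_attempted < MAX_IVS_REDIALS:
        return "wait for IVS redial"
    return "operator initiates call-back"
```

The tests in 1.1 and 1.3 effectively probe both branches: the redial tests exercise the IVS side, and the call-back tests measure how long the operator waits before taking over.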
TESTING ENVIRONMENT. More detailed test plans and procedures will be part of the test preparations.

ID  Area        Location               Vehicles  Roaming  Mode     Init.  Signal strength eval.  KPI1  KPI2  KPI3  KPI4  KPI5
U1  Urban       Göteborg Center        3         N        Active   A      Yes                    x     x     x
U2  Urban       Göteborg Center        1         N        Active   M      Yes                    x     x     x     x
H1  Highway     Highway                3         N        Active   A      Yes                    x     x     x
H2  Highway     Highway                1         N        Active   M      Yes                    x     x     x     x
R1  Rural       Small Roads            3         N        Active   A      Yes                    x     x     x
R2  Rural       Small Roads            1         N        Active   M      Yes                    x     x     x     x
R3  Rural       Small Roads            3         Y        Active   A      Yes                    x     x     x
L1  Laboratory  Karlskrona Laboratory  0         N        Active   M      Yes                    x     x     x     (x)   x
L2  Laboratory  Karlskrona Laboratory  0         N        Dormant  M      Yes                    x     x     x     (x)   x
L3  Laboratory  Karlskrona Laboratory  0         Y        Active   M      Yes                    x     x     x     (x)

KPI 1 = success rate eCall; KPI 2 = success rate MSD; KPI 3 = voice channel blocking; KPI 4 = voice channel disturbance; KPI 5 = weak signal behavior. Mode: Active eCall / Dormant Mode (eCall only). Init.: Automatic / Manual initiated eCall.
Roaming vehicle = Telia SIM card accessing Telenor and vice versa. Manual eCalls are executed by a physical push on the SOS button, with a dedicated operator answering. Automatic eCalls are generated within the car, to collect a large number of performed tests. Time stamps of all events during the eCall are recorded both in the IVS and in the test PSAP.
TESTING ENVIRONMENT. For each project phase the following tests are performed, in the given order:
1. System tests
• During the system test phase the separate system components are tested; the components are regarded as stand-alone, independent components. The following system components are distinguished:
o Vehicle (IVS modem)
o GSM / UMTS network
o Stand-alone PSAP modem
o Alarmcentrale 1-1-2 (PSAP1), with an integrated PSAP modem
o Interface Alarmcentrale 1-1-2 / Rijkswaterstaat
o Interface Alarmcentrale 1-1-2 / EUCARIS
o Meldkamercentrale (PSAP2)
• Testing of the component ‘Vehicle (IVS modem)’ is entirely the responsibility of the IVS modem supplier. Testing of the component ‘GSM / UMTS network’ is entirely the responsibility of the network providers (MNOs).
• System tests are performed in the lab environment (i.e. a fully controlled environment), meaning that no operational users are involved. Tests are strictly related to the technical working of the equipment.
2. Integration tests
• During the integration test phase, focus is placed on the end-to-end working of the connected system components and their functionality.
• Integration tests are performed in the lab environment, meaning that no operational users are involved. Tests are strictly related to the technical working of the equipment.
3. Performance tests
• During the performance test phase, focus is placed on the process of handling an eCall. Attention is paid to the following aspects: usability and performance criteria.
• During the performance tests the eCall test fleet is involved in drive testing. Performance tests are performed by operational users.
• The key performance criteria used during the performance test phase are:
o Percentage of success (also known as ‘success rate’) with variation in signal strength, caused by variation in clutter density
o Duration as defined by the time table
Tests are always based on a predefined set of test scenarios. Each set of test scenarios is defined before a particular test cycle is started. In addition to the system tests, integration tests and performance tests in a laboratory environment, testing by driving a pre-defined route will be done. The emphasis is on testing realistic combinations of eCalls under different circumstances; the eCalls will be generated in the field, several at a time. Emphasis is also placed on the processes in the PSAP and TMC with the information generated by the eCall. In every scenario all the KPIs will be taken into account.
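The ‘percentage of success with variation in signal strength’ criterion amounts to bucketing eCall attempts by measured signal level and computing a success rate per bucket. A sketch follows; the dBm bucket boundaries, the function name, and the sample data are illustrative assumptions, not values defined by the pilot:

```python
from collections import defaultdict

# Sketch of the success-rate KPI grouped by signal strength. The dBm
# thresholds below are assumed for illustration only.

def success_rate_by_signal(attempts):
    """attempts: iterable of (signal_dbm, succeeded) pairs."""
    def bucket(dbm):
        if dbm >= -85:
            return "strong"
        if dbm >= -100:
            return "medium"
        return "weak"

    counts = defaultdict(lambda: [0, 0])  # bucket -> [successes, total]
    for dbm, ok in attempts:
        c = counts[bucket(dbm)]
        c[0] += int(ok)
        c[1] += 1
    return {b: s / t for b, (s, t) in counts.items()}

rates = success_rate_by_signal([(-70, True), (-72, True), (-95, True),
                                (-96, False), (-110, False), (-108, True)])
```

Comparing the per-bucket rates shows how clutter density (which drives the signal-strength variation) degrades the success rate.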
TESTING ENVIRONMENT. If you attempt to use or otherwise use any testnet, services labeled as “beta” or “demo”, or sandbox services on our Website or linked websites (collectively, “Testing Environments”), you agree to use the test credentials and data provided by us, or otherwise provide fake information. You will not provide any real or otherwise personally identifiable information (e.g. an SSN), sensitive financial information (e.g. a username or password), or any information which you do not own or are not authorized to use in our Testing Environments. As further detailed in Section 7 (No Warranty), all Testing Environments are provided “as is” and “as available” without any warranty, and you should consider all tokens and funds at risk of loss. For the avoidance of doubt, this section does not relate to any Website services relating to the Issuer’s offerings or services.