
The WFS cartridge uses methods similar to the default import scripts described on our website.

However, to fully understand how it works, I advise you to take a closer look at this documentation.

Importing WFS tables into Earthlight is usually a two-stage process. That is why we have split this documentation into two parts: 1 – creating a list of the tables held by the WFS service, and 2 – importing a table from the WFS service into the Earthlight database.

Stage I - The first stage is to learn what resources are hosted by the WFS service and what their names are.

In most situations, we won't know in advance which tables the WFS service holds. However, in order to upload them to the database, we need to provide the table name parameter in the URL. Data Pump can create a CSV file containing details of all tables held by the particular WFS host. Here you can see an example script used for that purpose:

<?xml version="1.0" encoding="utf-8"?>
<Script>
 <Actions>
    <Action p3:type="Load" xmlns:p3="http://www.w3.org/2001/XMLSchema-instance">
      <Source>cartridge=wfs;data source="http://www.geostore.com/OGC/OGCInterface?SERVICE=WFS&amp;UID=id&amp;PASSWORD=password&amp;INTERFACE=ENVIRONMENTWFS&amp;VERSION=1.0.0";table name=""</Source>
      <Destination>c:\temp\ListOfAvailableTables.csv</Destination>
    </Action>
 </Actions>
</Script> 

As you can see, the script structure is similar to a typical import script. In the <Action> section we provide information about the Source and Destination. However, there are a few changes which I would like to discuss:

Source

  1. Please indicate the cartridge type (in all cases it will be: cartridge=wfs),
  2. In data source="" (WFS address in quotes) we need to provide the URL of the chosen WFS service:
    * Please remember never to specify the "request" and "typeName" parameters in the URL. Data Pump needs to set them explicitly and won't be able to do so if they are already provided in the URL,
    * If no VERSION parameter is provided in the URL, Data Pump will automatically set it to 2.0.0,
    * Because the script file is parsed as XML, it is crucial to replace “&” with “&amp;”; otherwise the import operation will fail.
  3. Finally, we need to specify the table name="" parameter:
    * In this case, please leave the parameter empty in order to list all tables held by the WFS service.

Important

Please remember that only the script file is parsed as XML. Therefore, when you are using the 3-file scenario (.script, .destination, .source), the .source file shouldn’t contain escaped &amp; characters (leave the “&” characters as they originally were).
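For illustration, the escaping rules above mean the same data source string looks different depending on which file it lives in (the URL and credentials below are the placeholders from the example script):

```
<!-- Inside a .script file (parsed as XML): "&" must be written as "&amp;" -->
<Source>cartridge=wfs;data source="http://www.geostore.com/OGC/OGCInterface?SERVICE=WFS&amp;UID=id&amp;PASSWORD=password&amp;INTERFACE=ENVIRONMENTWFS&amp;VERSION=1.0.0";table name=""</Source>

# Inside a separate .source file (not parsed as XML): leave "&" as-is
cartridge=wfs;data source="http://www.geostore.com/OGC/OGCInterface?SERVICE=WFS&UID=id&PASSWORD=password&INTERFACE=ENVIRONMENTWFS&VERSION=1.0.0";table name=""
```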


Destination

We need to provide a path to the location where the CSV file listing all tables held by the WFS service will be created (used only in conjunction with the empty table name="" parameter from <Source>).

The generated CSV file contains details about all tables held by the particular WFS host.
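As a rough illustration, the file might contain one row per feature type, using the names the WFS service advertises (the column heading here is illustrative; the exact columns depend on your Data Pump version):

```
Table name
ea-wfs-areas_public_face_inspire
```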

Stage II - Once we know the names of the available tables, we can proceed to stage two – importing the data into Earthlight.

There is not much we will need to change in order to import the specified table into the database.

First, please take a look at an example .script file which can be used to import the "ea-wfs-areas_public_face_inspire" table into the database:

<?xml version="1.0" encoding="utf-8"?>
<Script>
 <Actions>
    <Action p3:type="Load" xmlns:p3="http://www.w3.org/2001/XMLSchema-instance">
      <Source>cartridge=wfs;data source="http://www.geostore.com/OGC/OGCInterface?SERVICE=WFS&amp;UID=UDATAGOV2011&amp;PASSWORD=datagov2011&amp;INTERFACE=ENVIRONMENTWFS&amp;VERSION=1.0.0";table name="ea-wfs-areas_public_face_inspire"</Source>
      <Destination>database=statmap;user id=your_id;password=your_password;timeout=15;pooling=True;enlist=False;integrated security=False;initial catalog=gisdb;custom port=1033;data source=your_datasource\sqlexpress;cartridge=SqlServer;table name=areas_of_onb_inspire;schema=dbo</Destination>
    </Action>
 </Actions>
</Script>

Source

  1. Please indicate the cartridge type (in all cases it will be: cartridge=wfs),
  2. In data source="" (WFS address in quotes) we need to provide the URL of the chosen WFS service:
    * Please remember never to specify the "request" and "typeName" parameters in the URL. Data Pump needs to set them explicitly and won't be able to do so if they are already provided in the URL,
    * If no VERSION parameter is provided in the URL, Data Pump will automatically set it to 2.0.0,
    * Because the script file is parsed as XML, it is crucial to replace “&” with “&amp;”; otherwise the import operation will fail.
  3. Finally, we need to specify the table name="" parameter:
    * This time, provide (in quotes) the name of the chosen table exactly as it appears in the CSV file, e.g. table name="ea-wfs-areas_public_face_inspire".

IMPORTANT

Please remember that only the script file is parsed as XML. Therefore, when you are using the 3-file scenario (.script, .destination, .source), the .source file shouldn’t contain escaped &amp; characters (leave the “&” characters as they originally were).


Destination

The syntax of the database connection string is the same as for typical import scripts. In this case, the name under which the data will be held in the database needs to be provided in the table name parameter (e.g.: table name=areas_of_onb_inspire), as part of the entire connection string.

IMPORTANT

Please note that we are not using quotes in the table name parameter within the Destination section.

Please remember that some characters in the original name (provided by the WFS service) will not be accepted by the database (e.g. “-” or “:”). In such a scenario, we can’t simply copy the name into the table name parameter without amending it first.
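Since the WFS name has to be amended by hand, a small helper can show the idea. This is an illustrative sketch only (sanitize_table_name is our own hypothetical helper, not part of Data Pump), assuming that letters, digits and underscores are safe and everything else should become an underscore:

```python
import re

def sanitize_table_name(wfs_name: str) -> str:
    """Derive a database-friendly table name from a WFS feature-type name.

    Characters the database won't accept (e.g. "-" or ":") are replaced
    with underscores. Illustrative helper, not part of Data Pump itself.
    """
    return re.sub(r"[^A-Za-z0-9_]", "_", wfs_name)

print(sanitize_table_name("ea-wfs-areas_public_face_inspire"))
# -> ea_wfs_areas_public_face_inspire
```

Running the helper on the example table from this guide turns "ea-wfs-areas_public_face_inspire" into a name suitable for the table name parameter of the Destination string.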

For more information about DataPump import scenarios, please visit: DataPump Import

I hope this guide will be helpful for creating your WFS import scripts. If you have any questions, you can always contact StatMap Support for advice.

