Implement ALTER TABLE clause for MySQL migrations.

The new ALTER TABLE facility makes it possible to act on tables found in the
MySQL database before the migration happens. In this patch the only actions
provided are RENAME TO and SET SCHEMA, which fixes #224.

In order to provide the same option for MS SQL users, we will have to make
it work at the SCHEMA level (ALTER SCHEMA ... RENAME TO ...) and modify the
internal schema struct so that the schema slot of our table instances holds
a schema instance rather than its name.

Lacking an MS SQL test database and instance, the facility is not yet
provided for that source type.
Dimitri Fontaine 2016-03-06 21:40:37 +01:00
parent d4737a39ca
commit c724018840
18 changed files with 287 additions and 95 deletions

debian/control

@@ -3,7 +3,7 @@ Section: database
Priority: extra
Maintainer: Dimitri Fontaine <dim@tapoueh.org>
Uploaders: Christoph Berg <myon@debian.org>
-Build-Depends: debhelper (>= 8.0.0), sbcl (>= 1.1.13), ruby-ronn, buildapp (>= 1.5), cl-asdf (>= 3.0.3), cl-log, cl-postmodern, cl-simple-date, cl-qmynd, cl-split-sequence, cl-unicode, cl-interpol, cl-csv, cl-fad, cl-lparallel, cl-esrap, cl-alexandria, cl-drakma, cl-flexi-streams, cl-usocket, cl-local-time, cl-command-line-arguments, cl-abnf, cl-db3, cl-py-configparser, cl-sqlite, cl-trivial-backtrace, cl-markdown, cl-md5, cl-asdf-finalizers, cl-asdf-system-connections, cl-cffi (>= 1:0.12.0), cl-ixf, gawk, cl-bordeaux-threads (>= 0.8.3), cl-metabang-bind, cl-mssql, cl-uuid, cl-trivial-utf-8, cl-quri, cl-utilities
+Build-Depends: debhelper (>= 8.0.0), sbcl (>= 1.1.13), ruby-ronn, buildapp (>= 1.5), cl-asdf (>= 3.0.3), cl-log, cl-postmodern, cl-simple-date, cl-qmynd, cl-split-sequence, cl-unicode, cl-interpol, cl-csv, cl-fad, cl-lparallel, cl-esrap, cl-alexandria, cl-drakma, cl-flexi-streams, cl-usocket, cl-local-time, cl-command-line-arguments, cl-abnf, cl-db3, cl-py-configparser, cl-sqlite, cl-trivial-backtrace, cl-markdown, cl-md5, cl-asdf-finalizers, cl-asdf-system-connections, cl-cffi (>= 1:0.12.0), cl-ixf, gawk, cl-bordeaux-threads (>= 0.8.3), cl-metabang-bind, cl-mssql, cl-uuid, cl-trivial-utf-8, cl-quri, cl-utilities, cl-ppcre
Standards-Version: 3.9.6
Homepage: https://github.com/dimitri/pgloader
Vcs-Git: https://github.com/dimitri/pgloader.git
@@ -28,7 +28,7 @@ Description: extract, transform and load data into PostgreSQL
Package: cl-pgloader
Architecture: all
-Depends: ${misc:Depends}, cl-asdf (>= 3.0.3), cl-log, cl-postmodern, cl-simple-date, cl-qmynd, cl-split-sequence, cl-unicode, cl-interpol, cl-csv, cl-fad, cl-lparallel, cl-esrap, cl-alexandria, cl-drakma, cl-flexi-streams, cl-usocket, cl-local-time, cl-command-line-arguments, cl-abnf, cl-db3, cl-py-configparser, cl-sqlite, cl-trivial-backtrace, cl-markdown, cl-md5, cl-asdf-finalizers, cl-asdf-system-connections, cl-cffi (>= 1:0.12.0), cl-bordeaux-threads (>= 0.8.3), cl-metabang-bind, cl-uuid, cl-trivial-utf-8, cl-quri, cl-utilities
+Depends: ${misc:Depends}, cl-asdf (>= 3.0.3), cl-log, cl-postmodern, cl-simple-date, cl-qmynd, cl-split-sequence, cl-unicode, cl-interpol, cl-csv, cl-fad, cl-lparallel, cl-esrap, cl-alexandria, cl-drakma, cl-flexi-streams, cl-usocket, cl-local-time, cl-command-line-arguments, cl-abnf, cl-db3, cl-py-configparser, cl-sqlite, cl-trivial-backtrace, cl-markdown, cl-md5, cl-asdf-finalizers, cl-asdf-system-connections, cl-cffi (>= 1:0.12.0), cl-bordeaux-threads (>= 0.8.3), cl-metabang-bind, cl-uuid, cl-trivial-utf-8, cl-quri, cl-utilities, cl-ppcre
Description: extract, transform and load data into PostgreSQL
pgloader imports data from different kind of sources and COPY it into
PostgreSQL.


@@ -1725,6 +1725,8 @@ LOAD DATABASE
\-\- INCLUDING ONLY TABLE NAMES MATCHING ~/film/, \'actor\'
\-\- EXCLUDING TABLE NAMES MATCHING ~<ory>
\-\- DECODING TABLE NAMES MATCHING ~/messed/, ~/encoding/ AS utf8
\-\- ALTER TABLE NAMES MATCHING \'film\' RENAME TO \'films\'
\-\- ALTER TABLE NAMES MATCHING ~/_list$/ SET SCHEMA \'mv\'
BEFORE LOAD DO
$$ create schema if not exists sakila; $$;
@@ -2104,6 +2106,31 @@ DECODING TABLE NAMES MATCHING ~/messed/, ~/encoding/ AS utf8
.IP
You can use as many such rules as you need, all with possibly different encodings\.
.
.IP "\(bu" 4
\fIALTER TABLE NAMES MATCHING\fR
.
.IP
Introduce a comma separated list of table names or \fIregular expressions\fR that you want to target in the pgloader \fIALTER TABLE\fR command\. The only two available actions are \fISET SCHEMA\fR and \fIRENAME TO\fR, both of which take a quoted string as parameter:
.
.IP "" 4
.
.nf
ALTER TABLE NAMES MATCHING ~/_list$/, \'sales_by_store\', ~/sales_by/
SET SCHEMA \'mv\'
ALTER TABLE NAMES MATCHING \'film\' RENAME TO \'films\'
.
.fi
.
.IP "" 0
.
.IP
You can use as many such rules as you need\. The list of tables to be migrated is searched in pgloader memory against the \fIALTER TABLE\fR matching rules, and for each command pgloader stops at the first matching criteria (regexp or string)\.
.
.IP
No \fIALTER TABLE\fR command is sent to PostgreSQL; the modification happens at the level of the pgloader in\-memory representation of your source database schema\. In case of a name change, the mapping is kept and reused in the \fIforeign key\fR and \fIindex\fR support\.
.
.IP "" 0
.
.SS "LIMITATIONS"


@@ -1474,6 +1474,8 @@ Here's an example:
-- INCLUDING ONLY TABLE NAMES MATCHING ~/film/, 'actor'
-- EXCLUDING TABLE NAMES MATCHING ~<ory>
-- DECODING TABLE NAMES MATCHING ~/messed/, ~/encoding/ AS utf8
-- ALTER TABLE NAMES MATCHING 'film' RENAME TO 'films'
-- ALTER TABLE NAMES MATCHING ~/_list$/ SET SCHEMA 'mv'
BEFORE LOAD DO
$$ create schema if not exists sakila; $$;
@@ -1793,6 +1795,28 @@ The `database` command accepts the following clauses and options:
You can use as many such rules as you need, all with possibly different
encodings.
- *ALTER TABLE NAMES MATCHING*
Introduce a comma separated list of table names or *regular expressions*
that you want to target in the pgloader *ALTER TABLE* command. The only
two available actions are *SET SCHEMA* and *RENAME TO*, both of which take
a quoted string as parameter:
ALTER TABLE NAMES MATCHING ~/_list$/, 'sales_by_store', ~/sales_by/
SET SCHEMA 'mv'
ALTER TABLE NAMES MATCHING 'film' RENAME TO 'films'
You can use as many such rules as you need. The list of tables to be
migrated is searched in pgloader memory against the *ALTER TABLE*
matching rules, and for each command pgloader stops at the first
matching criteria (regexp or string).
No *ALTER TABLE* command is sent to PostgreSQL; the modification happens
at the level of the pgloader in-memory representation of your source
database schema. In case of a name change, the mapping is kept and
reused in the *foreign key* and *index* support.
### LIMITATIONS
The `database` command currently only supports MySQL source database and has


@@ -34,6 +34,7 @@
#:mssql ; M$ SQL connectivity
#:uuid ; Transforming MS SQL unique identifiers
#:quri ; decode URI parameters
#:cl-ppcre ; Perl Compatible Regular Expressions
)
:components
((:module "src"
@@ -64,6 +65,7 @@
(:file "read-sql-files")
(:file "quoting")
(:file "schema-structs" :depends-on ("quoting"))
(:file "alter-table" :depends-on ("schema-structs"))
;; State, monitoring, reporting
(:file "reject" :depends-on ("state"))
@@ -114,6 +116,8 @@
(:file "command-copy")
(:file "command-dbf")
(:file "command-cast-rules")
(:file "command-materialize-views")
(:file "command-alter-table")
(:file "command-mysql")
(:file "command-including-like")
(:file "command-mssql")


@@ -98,6 +98,7 @@
#:push-to-end
#:with-schema
#:alter-table
#:format-table-name))
@@ -259,6 +260,7 @@
#:push-to-end
#:with-schema
#:alter-table
#:format-table-name
#:format-default-value


@@ -0,0 +1,58 @@
;;;
;;; ALTER TABLE allows changing some table properties while migrating
;;; from a source to PostgreSQL; currently the schema and the name.
;;;
(in-package #:pgloader.parser)
(defrule match-rule-target-regex quoted-regex)
(defrule match-rule-target-string quoted-namestring
(:lambda (x) (list :string x)))
(defrule match-rule-target (or match-rule-target-string
match-rule-target-regex))
(defrule another-match-rule-target (and comma match-rule-target)
(:lambda (x)
(bind (((_ target) x)) target)))
(defrule filter-list-matching
(and match-rule-target (* another-match-rule-target))
(:lambda (source)
(destructuring-bind (filter1 filters) source
(list* filter1 filters))))
(defrule alter-table-names-matching (and kw-alter kw-table kw-names kw-matching
filter-list-matching)
(:lambda (alter-table)
(bind (((_ _ _ _ match-rule-target-list) alter-table))
match-rule-target-list)))
(defrule rename-to (and kw-rename kw-to quoted-namestring)
(:lambda (stmt)
(bind (((_ _ new-name) stmt))
(list #'pgloader.schema::alter-table-rename new-name))))
(defrule set-schema (and kw-set kw-schema quoted-namestring)
(:lambda (stmt)
(bind (((_ _ schema) stmt))
(list #'pgloader.schema::alter-table-set-schema schema))))
(defrule alter-table-action (or rename-to
set-schema))
(defrule alter-table-command (and alter-table-names-matching alter-table-action)
(:lambda (alter-table-command)
(destructuring-bind (match-rule-target-list action) alter-table-command
(loop :for match-rule-target :in match-rule-target-list
:collect (pgloader.schema::make-match-rule
:type (first match-rule-target)
:target (second match-rule-target)
:action (first action)
:args (rest action))))))
(defrule alter-table (+ (and alter-table-command ignore-whitespace))
(:lambda (alter-table-command-list)
(cons :alter-table
(loop :for (command ws) :in alter-table-command-list
:collect command))))
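
For readers following the grammar above, here is a rough, hand-written sketch
of what a single clause such as ALTER TABLE NAMES MATCHING 'film' RENAME TO
'films' is expected to reduce to. The form below is illustrative, built with
the same make-match-rule constructor the rules use, not captured parser
output:

    (cons :alter-table
          (list                       ; one entry per ALTER TABLE clause
           (list                      ; one match rule per MATCHING target
            (pgloader.schema::make-match-rule
             :type   :string
             :target "film"
             :action #'pgloader.schema::alter-table-rename
             :args   (list "films")))))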


@@ -100,7 +100,9 @@
(def-keyword-rule "schemas")
(def-keyword-rule "only")
(def-keyword-rule "drop")
(def-keyword-rule "alter")
(def-keyword-rule "create")
(def-keyword-rule "rename")
(def-keyword-rule "materialize")
(def-keyword-rule "reset")
(def-keyword-rule "table")


@@ -0,0 +1,39 @@
;;;
;;; Materialize views by copying their data over, allows for doing advanced
;;; ETL processing by having parts of the processing happen on the MySQL
;;; query side.
;;;
(in-package #:pgloader.parser)
(defrule view-name (and (alpha-char-p character)
(* (or (alpha-char-p character)
(digit-char-p character)
#\_)))
(:text t))
(defrule view-sql (and kw-as dollar-quoted)
(:destructure (as sql) (declare (ignore as)) sql))
(defrule view-definition (and view-name (? view-sql))
(:destructure (name sql) (cons name sql)))
(defrule another-view-definition (and comma view-definition)
(:lambda (source)
(bind (((_ view) source)) view)))
(defrule views-list (and view-definition (* another-view-definition))
(:lambda (vlist)
(destructuring-bind (view1 views) vlist
(list* view1 views))))
(defrule materialize-all-views (and kw-materialize kw-all kw-views)
(:constant :all))
(defrule materialize-view-list (and kw-materialize kw-views views-list)
(:destructure (mat views list) (declare (ignore mat views)) list))
(defrule materialize-views (or materialize-view-list materialize-all-views)
(:lambda (views)
(cons :views views)))
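
As a quick reading aid, here is a rough sketch of the result shapes these
rules are expected to yield; the view names are illustrative, not captured
output:

    ;; MATERIALIZE ALL VIEWS is expected to reduce to:
    (cons :views :all)
    ;; and MATERIALIZE VIEWS film_list, staff_list to one (name . sql) cons
    ;; per view, with sql staying NIL when no AS $$ ... $$ body is given:
    (cons :views (list (cons "film_list" nil) (cons "staff_list" nil)))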


@@ -28,59 +28,10 @@
(and mysql-option (* (and comma mysql-option))))
(:function flatten-option-list))
;;;
;;; Materialize views by copying their data over, allows for doing advanced
;;; ETL processing by having parts of the processing happen on the MySQL
;;; query side.
;;;
(defrule view-name (and (alpha-char-p character)
(* (or (alpha-char-p character)
(digit-char-p character)
#\_)))
(:text t))
(defrule view-sql (and kw-as dollar-quoted)
(:destructure (as sql) (declare (ignore as)) sql))
(defrule view-definition (and view-name (? view-sql))
(:destructure (name sql) (cons name sql)))
(defrule another-view-definition (and comma view-definition)
(:lambda (source)
(bind (((_ view) source)) view)))
(defrule views-list (and view-definition (* another-view-definition))
(:lambda (vlist)
(destructuring-bind (view1 views) vlist
(list* view1 views))))
(defrule materialize-all-views (and kw-materialize kw-all kw-views)
(:constant :all))
(defrule materialize-view-list (and kw-materialize kw-views views-list)
(:destructure (mat views list) (declare (ignore mat views)) list))
(defrule materialize-views (or materialize-view-list materialize-all-views)
(:lambda (views)
(cons :views views)))
;;;
;;; Including only some tables or excluding some others
;;;
(defrule namestring-or-regex (or quoted-namestring quoted-regex))
(defrule another-namestring-or-regex (and comma namestring-or-regex)
(:lambda (source)
(bind (((_ re) source)) re)))
(defrule filter-list-matching
(and namestring-or-regex (* another-namestring-or-regex))
(:lambda (source)
(destructuring-bind (filter1 filters) source
(list* filter1 filters))))
(defrule including-matching
(and kw-including kw-only kw-table kw-names kw-matching filter-list-matching)
(:lambda (source)
@@ -115,6 +66,7 @@
(defrule load-mysql-optional-clauses (* (or mysql-options
gucs
casts
alter-table
materialize-views
including-matching
excluding-matching
@@ -186,6 +138,7 @@
(defun lisp-code-for-loading-from-mysql (my-db-conn pg-db-conn
&key
gucs casts views before after options
alter-table
((:including incl))
((:excluding excl))
((:decoding decoding-as)))
@@ -207,6 +160,7 @@
:including ',incl
:excluding ',excl
:materialize-views ',views
:alter-table ',alter-table
:set-table-oids t
,@(remove-batch-control-option options))
@@ -218,7 +172,7 @@
pg-db-uri
&key
gucs casts views before after
-options including excluding decoding)
+options alter-table including excluding decoding)
source
(cond (*dry-run*
(lisp-code-for-mysql-dry-run my-db-uri pg-db-uri))
@@ -230,6 +184,7 @@
:before before
:after after
:options options
:alter-table alter-table
:including including
:excluding excluding
:decoding decoding))))))


@@ -51,3 +51,4 @@
(defrule trimmed-name (and ignore-whitespace name)
(:destructure (whitespace name) (declare (ignore whitespace)) name))
(defrule namestring-or-regex (or quoted-namestring quoted-regex))


@@ -301,7 +301,6 @@ select i.relname,
:collect (make-pgsql-index :name name
:schema schema
:table-name table-name
:table-oid table-oid
:primary primary
:unique unique


@@ -44,7 +44,7 @@
;;; API for Foreign Keys
;;;
(defstruct pgsql-fkey
-name table-name columns foreign-table foreign-columns update-rule delete-rule)
+name table columns foreign-table foreign-columns update-rule delete-rule)
(defgeneric format-pgsql-create-fkey (fkey)
(:documentation
@@ -58,10 +58,10 @@
"Generate the PostgreSQL statement to rebuild a MySQL Foreign Key"
(format nil
"ALTER TABLE ~a ADD CONSTRAINT ~a FOREIGN KEY(~{~a~^,~}) REFERENCES ~a(~{~a~^,~})~:[~*~; ON UPDATE ~a~]~:[~*~; ON DELETE ~a~]"
-(pgsql-fkey-table-name fk)
+(format-table-name (pgsql-fkey-table fk))
(pgsql-fkey-name fk) ; constraint name
(pgsql-fkey-columns fk)
-(pgsql-fkey-foreign-table fk)
+(format-table-name (pgsql-fkey-foreign-table fk))
(pgsql-fkey-foreign-columns fk)
(pgsql-fkey-update-rule fk)
(pgsql-fkey-update-rule fk)
@@ -71,7 +71,7 @@
(defmethod format-pgsql-drop-fkey ((fk pgsql-fkey) &key all-pgsql-fkeys)
"Generate the PostgreSQL statement to rebuild a MySQL Foreign Key"
(let* ((constraint-name (apply-identifier-case (pgsql-fkey-name fk)))
-(table-name (apply-identifier-case (pgsql-fkey-table-name fk)))
+(table-name (format-table-name (pgsql-fkey-table fk)))
(fkeys (cdr (assoc table-name all-pgsql-fkeys :test #'string=)))
(fkey-exists (member constraint-name fkeys :test #'string=)))
(when fkey-exists
@@ -230,7 +230,7 @@
;; the struct is used both for supporting new index creation from non
;; PostgreSQL system and for drop/create indexes when using the 'drop
;; indexes' option (in CSV mode and the like)
-name schema table-name table-oid primary unique columns sql conname condef)
+name schema table-oid primary unique columns sql conname condef)
(defgeneric format-pgsql-create-index (table index)
(:documentation
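
To make the new calling convention concrete, here is a minimal sketch of
building a foreign key from table structs rather than name strings. The
table, constraint and column names are illustrative, make-table is assumed to
accept the :name and :schema keywords it is given elsewhere in this patch,
and the exact quoting of the output depends on format-table-name and
apply-identifier-case:

    (let* ((film     (make-table :name "film" :schema "sakila"))
           (language (make-table :name "language" :schema "sakila"))
           (fk       (make-pgsql-fkey :name "film_language_id_fkey"
                                      :table film
                                      :columns '("language_id")
                                      :foreign-table language
                                      :foreign-columns '("language_id"))))
      ;; renders something like:
      ;;   ALTER TABLE sakila.film ADD CONSTRAINT film_language_id_fkey
      ;;     FOREIGN KEY(language_id) REFERENCES sakila.language(language_id)
      (format-pgsql-create-fkey fk))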


@@ -136,6 +136,7 @@
including
excluding
set-table-oids
alter-table
materialize-views)
"Export database source data and Import it into PostgreSQL"
(let* ((copy-kernel (make-kernel worker-count))
@@ -166,6 +167,11 @@
;; cast the catalog into something PostgreSQL can work on
(cast catalog)
;; if asked, now alter the catalog with given rules: the alter-table
;; keyword parameter actually contains a set of alter table rules.
(when alter-table
(alter-table catalog alter-table))
;; if asked, first drop/create the tables on the PostgreSQL side
(handler-case
(cond ((and (or create-tables schema-only) (not data-only))
@@ -200,10 +206,13 @@
(view-list catalog))
:do (let ((table-source (instanciate-table-copy-object copy table)))
-(log-message :debug "TARGET: ~a" (target table-source))
+;; that needs *print-circle* to true, and anyway it's too much
;; output in general.
;;
;; (log-message :debug "TARGET: ~a" (target table-source))
(log-message :debug "TRANSFORMS(~a): ~s"
(format-table-name table)
-(mapcar #'column-transform (table-column-list table)))
+(transforms table-source))
;; first COPY the data from source to PostgreSQL, using copy-kernel
(unless schema-only


@@ -197,7 +197,6 @@ order by SchemaName,
(table (find-table schema table-name))
(index (make-pgsql-index :name index-name
:primary (= pkey 1)
:table-name (format-table-name table)
:unique (= unique 1)
:columns (list col))))
(add-index table index))
@@ -206,7 +205,7 @@ order by SchemaName,
(defun list-all-fkeys (catalog &key including excluding)
"Get the list of MSSQL index definitions per table."
(loop
-:for (fkey-name schema-name table-name col fschema ftable fcol)
+:for (fkey-name schema-name table-name col fschema-name ftable-name fcol)
:in (mssql-query (format nil "
SELECT
REPLACE(KCU1.CONSTRAINT_NAME, '.', '_') AS 'CONSTRAINT_NAME'
@@ -252,17 +251,13 @@ ORDER BY KCU1.CONSTRAINT_NAME, KCU1.ORDINAL_POSITION"
:do
(let* ((schema (find-schema catalog schema-name))
(table (find-table schema table-name))
(fschema (find-schema catalog fschema-name))
(ftable (find-table fschema ftable-name))
(pg-fkey
(make-pgsql-fkey :name fkey-name
-:table-name (format-table-name table)
+:table table
:columns (list col)
-:foreign-table
+:foreign-table ftable
(format-table-name
;; for code re-use, create a table instance here.
(make-table
:source-name (cons fschema ftable)
:name (apply-identifier-case ftable)
:schema fschema))
:foreign-columns (list fcol)))
(fkey (maybe-add-fkey table fkey-name pg-fkey
:key #'pgloader.pgsql::pgsql-fkey-name)))


@@ -232,7 +232,6 @@ GROUP BY table_name, index_name;"
(index
(make-pgsql-index :name name ; further processing is needed
:primary (string= name "PRIMARY")
:table-name (apply-identifier-case table-name)
:unique (not (string= "1" non-unique))
:columns (mapcar
#'apply-identifier-case
@@ -251,7 +250,7 @@ GROUP BY table_name, index_name;"
excluding)
"Get the list of MySQL Foreign Keys definitions per table."
(loop
-:for (table-name name ftable cols fcols update-rule delete-rule)
+:for (table-name name ftable-name cols fcols update-rule delete-rule)
:in (mysql-query (format nil "
SELECT tc.table_name, tc.constraint_name, k.referenced_table_name ft,
@@ -290,13 +289,14 @@ GROUP BY table_name, index_name;"
(filter-list-to-where-clause including)
excluding ; do we print the clause?
(filter-list-to-where-clause excluding t)))
-:do (let ((table (find-table schema table-name))
+:do (let* ((table (find-table schema table-name))
(ftable (find-table schema ftable-name))
(fk
(make-pgsql-fkey :name (apply-identifier-case name)
-:table-name (apply-identifier-case table-name)
+:table table
:columns (mapcar #'apply-identifier-case
(sq:split-sequence #\, cols))
-:foreign-table (apply-identifier-case ftable)
+:foreign-table ftable
:foreign-columns (mapcar
#'apply-identifier-case
(sq:split-sequence #\, fcols))


@@ -124,7 +124,6 @@
:when sql
:do (let ((table (find-table schema table-name))
(idxdef (make-sqlite-idx :name index-name
:table-name table-name
:sql sql)))
(add-index table idxdef))
:finally (return schema))))


@@ -0,0 +1,71 @@
;;;
;;; ALTER TABLE allows pgloader to apply transformations on the catalog
;;; retrieved before applying it to PostgreSQL: SET SCHEMA, RENAME, etc.
;;;
(in-package :pgloader.schema)
#|
See src/parsers/command-alter-table.lisp
(make-match-rule :type :regex
:target "_list$"
:action #'pgloader.schema::alter-table-set-schema
:args (list "mv"))
|#
(defstruct match-rule type target action args)
(defgeneric alter-table (object alter-table-rule-list))
(defmethod alter-table ((catalog catalog) alter-table-rule-list)
"Apply ALTER-TABLE-RULE-LIST to all schema of CATALOG."
(loop :for schema :in (catalog-schema-list catalog)
:do (alter-table schema alter-table-rule-list)))
(defmethod alter-table ((schema schema) alter-table-rule-list)
"Apply ALTER-TABLE-RULE-LIST to all tables and views of SCHEMA."
(loop :for table :in (append (schema-table-list schema)
(schema-view-list schema))
:do (alter-table table alter-table-rule-list)))
(defmethod alter-table ((table table) alter-table-rule-list)
"Apply ALTER-TABLE-RULE-LIST to TABLE."
;;
;; alter-table-rule-list is a list of set of rules, within each set we
;; only apply the first rules that matches.
;;
(loop :for rule-list :in alter-table-rule-list
:do (let ((match-rule
(loop :for match-rule :in rule-list
:thereis (when (rule-matches match-rule table)
match-rule))))
(when match-rule
(apply (match-rule-action match-rule)
(list* table (match-rule-args match-rule)))))))
;;;
;;; ALTER TABLE actions: functions that take a table and arguments and apply
;;; the altering action wanted to them.
;;;
(defun alter-table-set-schema (table schema-name)
"Alter the schema of TABLE, set SCHEMA-NAME instead."
(setf (table-schema table) schema-name))
(defun alter-table-rename (table new-name)
"Alter the name of TABLE to NEW-NAME."
(setf (table-name table) new-name))
;;;
;;; Apply the match rules as given by the parser to a table name.
;;;
(declaim (inline rule-matches))
(defun rule-matches (match-rule table)
"Return non-nil when TABLE matches given MATCH-RULE."
(declare (type match-rule match-rule) (type table table))
(let ((table-name (table-source-name table)))
(ecase (match-rule-type match-rule)
(:string (string= (match-rule-target match-rule) table-name))
(:regex (cl-ppcre:scan (match-rule-target match-rule) table-name)))))
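
A small, self-contained sketch of one rule applied by hand to one table; the
make-table keywords are assumed from schema-structs and the names are
illustrative. In the real flow, copy-database passes the whole parsed rule
set to the alter-table generic function shown above:

    (let ((table (make-table :source-name "film" :name "film"))
          (rule  (make-match-rule :type   :string
                                  :target "film"
                                  :action #'alter-table-rename
                                  :args   (list "films"))))
      (when (rule-matches rule table)
        (apply (match-rule-action rule)
               (list* table (match-rule-args rule))))
      (table-name table))   ; => "films"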


@@ -21,10 +21,17 @@ load database
-- MATERIALIZE VIEWS film_list, staff_list
MATERIALIZE ALL VIEWS
ALTER TABLE NAMES MATCHING ~/_list$/, 'sales_by_store', ~/sales_by/
SET SCHEMA 'mv'
ALTER TABLE NAMES MATCHING 'sales_by_store' RENAME TO 'sales_by_store_list'
ALTER TABLE NAMES MATCHING 'film' RENAME TO 'films'
-- INCLUDING ONLY TABLE NAMES MATCHING ~/film/, 'actor'
-- EXCLUDING TABLE NAMES MATCHING ~<ory>
BEFORE LOAD DO
$$ create schema if not exists sakila; $$,
-$$ alter database sakila set search_path to sakila, public; $$;
+$$ create schema if not exists mv; $$,
$$ alter database sakila set search_path to sakila, mv, public; $$;