Implement ALTER TABLE clause for MySQL migrations.

The new ALTER TABLE facility makes it possible to act on tables found in the
MySQL database before the migration happens. In this patch the only actions
provided are RENAME TO and SET SCHEMA, which fixes #224.

To provide the same option for MS SQL users, we will have to make it work
at the SCHEMA level (ALTER SCHEMA ... RENAME TO ...) and modify the
internal schema-struct so that the schema slot of our table instances is a
schema instance rather than its name.

Lacking an MS SQL test database and instance, the facility is not yet
provided for that source type.
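
For illustration, the new clause reads like this in a load command (the same
examples appear in the documentation and test file updated below):

    ALTER TABLE NAMES MATCHING ~/_list$/, 'sales_by_store', ~/sales_by/
         SET SCHEMA 'mv'

    ALTER TABLE NAMES MATCHING 'film' RENAME TO 'films'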
Dimitri Fontaine 2016-03-06 21:40:37 +01:00
parent d4737a39ca
commit c724018840
18 changed files with 287 additions and 95 deletions

debian/control
View File

@ -3,7 +3,7 @@ Section: database
Priority: extra
Maintainer: Dimitri Fontaine <dim@tapoueh.org>
Uploaders: Christoph Berg <myon@debian.org>
Build-Depends: debhelper (>= 8.0.0), sbcl (>= 1.1.13), ruby-ronn, buildapp (>= 1.5), cl-asdf (>= 3.0.3), cl-log, cl-postmodern, cl-simple-date, cl-qmynd, cl-split-sequence, cl-unicode, cl-interpol, cl-csv, cl-fad, cl-lparallel, cl-esrap, cl-alexandria, cl-drakma, cl-flexi-streams, cl-usocket, cl-local-time, cl-command-line-arguments, cl-abnf, cl-db3, cl-py-configparser, cl-sqlite, cl-trivial-backtrace, cl-markdown, cl-md5, cl-asdf-finalizers, cl-asdf-system-connections, cl-cffi (>= 1:0.12.0), cl-ixf, gawk, cl-bordeaux-threads (>= 0.8.3), cl-metabang-bind, cl-mssql, cl-uuid, cl-trivial-utf-8, cl-quri, cl-utilities
Build-Depends: debhelper (>= 8.0.0), sbcl (>= 1.1.13), ruby-ronn, buildapp (>= 1.5), cl-asdf (>= 3.0.3), cl-log, cl-postmodern, cl-simple-date, cl-qmynd, cl-split-sequence, cl-unicode, cl-interpol, cl-csv, cl-fad, cl-lparallel, cl-esrap, cl-alexandria, cl-drakma, cl-flexi-streams, cl-usocket, cl-local-time, cl-command-line-arguments, cl-abnf, cl-db3, cl-py-configparser, cl-sqlite, cl-trivial-backtrace, cl-markdown, cl-md5, cl-asdf-finalizers, cl-asdf-system-connections, cl-cffi (>= 1:0.12.0), cl-ixf, gawk, cl-bordeaux-threads (>= 0.8.3), cl-metabang-bind, cl-mssql, cl-uuid, cl-trivial-utf-8, cl-quri, cl-utilities, cl-ppcre
Standards-Version: 3.9.6
Homepage: https://github.com/dimitri/pgloader
Vcs-Git: https://github.com/dimitri/pgloader.git
@ -28,7 +28,7 @@ Description: extract, transform and load data into PostgreSQL
Package: cl-pgloader
Architecture: all
Depends: ${misc:Depends}, cl-asdf (>= 3.0.3), cl-log, cl-postmodern, cl-simple-date, cl-qmynd, cl-split-sequence, cl-unicode, cl-interpol, cl-csv, cl-fad, cl-lparallel, cl-esrap, cl-alexandria, cl-drakma, cl-flexi-streams, cl-usocket, cl-local-time, cl-command-line-arguments, cl-abnf, cl-db3, cl-py-configparser, cl-sqlite, cl-trivial-backtrace, cl-markdown, cl-md5, cl-asdf-finalizers, cl-asdf-system-connections, cl-cffi (>= 1:0.12.0), cl-bordeaux-threads (>= 0.8.3), cl-metabang-bind, cl-uuid, cl-trivial-utf-8, cl-quri, cl-utilities
Depends: ${misc:Depends}, cl-asdf (>= 3.0.3), cl-log, cl-postmodern, cl-simple-date, cl-qmynd, cl-split-sequence, cl-unicode, cl-interpol, cl-csv, cl-fad, cl-lparallel, cl-esrap, cl-alexandria, cl-drakma, cl-flexi-streams, cl-usocket, cl-local-time, cl-command-line-arguments, cl-abnf, cl-db3, cl-py-configparser, cl-sqlite, cl-trivial-backtrace, cl-markdown, cl-md5, cl-asdf-finalizers, cl-asdf-system-connections, cl-cffi (>= 1:0.12.0), cl-bordeaux-threads (>= 0.8.3), cl-metabang-bind, cl-uuid, cl-trivial-utf-8, cl-quri, cl-utilities, cl-ppcre
Description: extract, transform and load data into PostgreSQL
pgloader imports data from different kind of sources and COPY it into
PostgreSQL.

View File

@ -1725,6 +1725,8 @@ LOAD DATABASE
\-\- INCLUDING ONLY TABLE NAMES MATCHING ~/film/, \'actor\'
\-\- EXCLUDING TABLE NAMES MATCHING ~<ory>
\-\- DECODING TABLE NAMES MATCHING ~/messed/, ~/encoding/ AS utf8
\-\- ALTER TABLE NAMES MATCHING \'film\' RENAME TO \'films\'
\-\- ALTER TABLE NAMES MATCHING ~/_list$/ SET SCHEMA \'mv\'
BEFORE LOAD DO
$$ create schema if not exists sakila; $$;
@ -2104,6 +2106,31 @@ DECODING TABLE NAMES MATCHING ~/messed/, ~/encoding/ AS utf8
.IP
You can use as many such rules as you need, all with possibly different encodings\.
.
.IP "\(bu" 4
\fIALTER TABLE NAMES MATCHING\fR
.
.IP
Introduce a comma separated list of table names or \fIregular expressions\fR that you want to target in the pgloader \fIALTER TABLE\fR command\. The only two available actions are \fISET SCHEMA\fR and \fIRENAME TO\fR, both take a quoted string as parameter:
.
.IP "" 4
.
.nf
ALTER TABLE NAMES MATCHING ~/_list$/, \'sales_by_store\', ~/sales_by/
SET SCHEMA \'mv\'
ALTER TABLE NAMES MATCHING \'film\' RENAME TO \'films\'
.
.fi
.
.IP "" 0
.
.IP
You can use as many such rules as you need\. The list of tables to be migrated is searched in pgloader memory against the \fIALTER TABLE\fR matching rules, and for each command pgloader stops at the first matching criteria (regexp or string)\.
.
.IP
No \fIALTER TABLE\fR command is sent to PostgreSQL, the modification happens at the level of the pgloader in\-memory representation of your source database schema\. In case of a name change, the mapping is kept and reused in the \fIforeign key\fR and \fIindex\fR support\.
.
.IP "" 0
.
.SS "LIMITATIONS"

View File

@ -1474,6 +1474,8 @@ Here's an example:
-- INCLUDING ONLY TABLE NAMES MATCHING ~/film/, 'actor'
-- EXCLUDING TABLE NAMES MATCHING ~<ory>
-- DECODING TABLE NAMES MATCHING ~/messed/, ~/encoding/ AS utf8
-- ALTER TABLE NAMES MATCHING 'film' RENAME TO 'films'
-- ALTER TABLE NAMES MATCHING ~/_list$/ SET SCHEMA 'mv'
BEFORE LOAD DO
$$ create schema if not exists sakila; $$;
@ -1793,6 +1795,28 @@ The `database` command accepts the following clauses and options:
You can use as many such rules as you need, all with possibly different
encodings.
- *ALTER TABLE NAMES MATCHING*
Introduce a comma separated list of table names or *regular expressions*
that you want to target in the pgloader *ALTER TABLE* command. The only
two available actions are *SET SCHEMA* and *RENAME TO*, both take a
quoted string as parameter:
ALTER TABLE NAMES MATCHING ~/_list$/, 'sales_by_store', ~/sales_by/
SET SCHEMA 'mv'
ALTER TABLE NAMES MATCHING 'film' RENAME TO 'films'
You can use as many such rules as you need. The list of tables to be
migrated is searched in pgloader memory against the *ALTER TABLE*
matching rules, and for each command pgloader stops at the first
matching criteria (regexp or string).
No *ALTER TABLE* command is sent to PostgreSQL, the modification happens
at the level of the pgloader in-memory representation of your source
database schema. In case of a name change, the mapping is kept and
reused in the *foreign key* and *index* support.
### LIMITATIONS
The `database` command currently only supports MySQL source database and has

View File

@ -34,6 +34,7 @@
#:mssql ; M$ SQL connectivity
#:uuid ; Transforming MS SQL unique identifiers
#:quri ; decode URI parameters
#:cl-ppcre ; Perl Compatible Regular Expressions
)
:components
((:module "src"
@ -64,6 +65,7 @@
(:file "read-sql-files")
(:file "quoting")
(:file "schema-structs" :depends-on ("quoting"))
(:file "alter-table" :depends-on ("schema-structs"))
;; State, monitoring, reporting
(:file "reject" :depends-on ("state"))
@ -114,6 +116,8 @@
(:file "command-copy")
(:file "command-dbf")
(:file "command-cast-rules")
(:file "command-materialize-views")
(:file "command-alter-table")
(:file "command-mysql")
(:file "command-including-like")
(:file "command-mssql")

View File

@ -98,6 +98,7 @@
#:push-to-end
#:with-schema
#:alter-table
#:format-table-name))
@ -259,6 +260,7 @@
#:push-to-end
#:with-schema
#:alter-table
#:format-table-name
#:format-default-value

View File

@ -0,0 +1,58 @@
;;;
;;; ALTER TABLE allows to change some of their properties while migrating
;;; from a source to PostgreSQL, currently only takes care of the schema.
;;;
(in-package #:pgloader.parser)
(defrule match-rule-target-regex quoted-regex)
(defrule match-rule-target-string quoted-namestring
(:lambda (x) (list :string x)))
(defrule match-rule-target (or match-rule-target-string
match-rule-target-regex))
(defrule another-match-rule-target (and comma match-rule-target)
(:lambda (x)
(bind (((_ target) x)) target)))
(defrule filter-list-matching
(and match-rule-target (* another-match-rule-target))
(:lambda (source)
(destructuring-bind (filter1 filters) source
(list* filter1 filters))))
(defrule alter-table-names-matching (and kw-alter kw-table kw-names kw-matching
filter-list-matching)
(:lambda (alter-table)
(bind (((_ _ _ _ match-rule-target-list) alter-table))
match-rule-target-list)))
(defrule rename-to (and kw-rename kw-to quoted-namestring)
(:lambda (stmt)
(bind (((_ _ new-name) stmt))
(list #'pgloader.schema::alter-table-rename new-name))))
(defrule set-schema (and kw-set kw-schema quoted-namestring)
(:lambda (stmt)
(bind (((_ _ schema) stmt))
(list #'pgloader.schema::alter-table-set-schema schema))))
(defrule alter-table-action (or rename-to
set-schema))
(defrule alter-table-command (and alter-table-names-matching alter-table-action)
(:lambda (alter-table-command)
(destructuring-bind (match-rule-target-list action) alter-table-command
(loop :for match-rule-target :in match-rule-target-list
:collect (pgloader.schema::make-match-rule
:type (first match-rule-target)
:target (second match-rule-target)
:action (first action)
:args (rest action))))))
(defrule alter-table (+ (and alter-table-command ignore-whitespace))
(:lambda (alter-table-command-list)
(cons :alter-table
(loop :for (command ws) :in alter-table-command-list
:collect command))))
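;;;
;;; Sketch only: given the clause
;;;
;;;   ALTER TABLE NAMES MATCHING ~/_list$/ SET SCHEMA 'mv'
;;;
;;; the alter-table rule above reduces to something like
;;;
;;;   (:alter-table
;;;    ((#S(MATCH-RULE :TYPE :REGEX :TARGET "_list$"
;;;                    :ACTION #<FUNCTION ALTER-TABLE-SET-SCHEMA>
;;;                    :ARGS ("mv")))))
;;;
;;; that is, a list of rule sets, each one a list of match-rule instances.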

View File

@ -100,7 +100,9 @@
(def-keyword-rule "schemas")
(def-keyword-rule "only")
(def-keyword-rule "drop")
(def-keyword-rule "alter")
(def-keyword-rule "create")
(def-keyword-rule "rename")
(def-keyword-rule "materialize")
(def-keyword-rule "reset")
(def-keyword-rule "table")

View File

@ -0,0 +1,39 @@
;;;
;;; Materialize views by copying their data over, allows for doing advanced
;;; ETL processing by having parts of the processing happen on the MySQL
;;; query side.
;;;
(in-package #:pgloader.parser)
(defrule view-name (and (alpha-char-p character)
(* (or (alpha-char-p character)
(digit-char-p character)
#\_)))
(:text t))
(defrule view-sql (and kw-as dollar-quoted)
(:destructure (as sql) (declare (ignore as)) sql))
(defrule view-definition (and view-name (? view-sql))
(:destructure (name sql) (cons name sql)))
(defrule another-view-definition (and comma view-definition)
(:lambda (source)
(bind (((_ view) source)) view)))
(defrule views-list (and view-definition (* another-view-definition))
(:lambda (vlist)
(destructuring-bind (view1 views) vlist
(list* view1 views))))
(defrule materialize-all-views (and kw-materialize kw-all kw-views)
(:constant :all))
(defrule materialize-view-list (and kw-materialize kw-views views-list)
(:destructure (mat views list) (declare (ignore mat views)) list))
(defrule materialize-views (or materialize-view-list materialize-all-views)
(:lambda (views)
(cons :views views)))

View File

@ -28,59 +28,10 @@
(and mysql-option (* (and comma mysql-option))))
(:function flatten-option-list))
;;;
;;; Materialize views by copying their data over, allows for doing advanced
;;; ETL processing by having parts of the processing happen on the MySQL
;;; query side.
;;;
(defrule view-name (and (alpha-char-p character)
(* (or (alpha-char-p character)
(digit-char-p character)
#\_)))
(:text t))
(defrule view-sql (and kw-as dollar-quoted)
(:destructure (as sql) (declare (ignore as)) sql))
(defrule view-definition (and view-name (? view-sql))
(:destructure (name sql) (cons name sql)))
(defrule another-view-definition (and comma view-definition)
(:lambda (source)
(bind (((_ view) source)) view)))
(defrule views-list (and view-definition (* another-view-definition))
(:lambda (vlist)
(destructuring-bind (view1 views) vlist
(list* view1 views))))
(defrule materialize-all-views (and kw-materialize kw-all kw-views)
(:constant :all))
(defrule materialize-view-list (and kw-materialize kw-views views-list)
(:destructure (mat views list) (declare (ignore mat views)) list))
(defrule materialize-views (or materialize-view-list materialize-all-views)
(:lambda (views)
(cons :views views)))
;;;
;;; Including only some tables or excluding some others
;;;
(defrule namestring-or-regex (or quoted-namestring quoted-regex))
(defrule another-namestring-or-regex (and comma namestring-or-regex)
(:lambda (source)
(bind (((_ re) source)) re)))
(defrule filter-list-matching
(and namestring-or-regex (* another-namestring-or-regex))
(:lambda (source)
(destructuring-bind (filter1 filters) source
(list* filter1 filters))))
(defrule including-matching
(and kw-including kw-only kw-table kw-names kw-matching filter-list-matching)
(:lambda (source)
@ -115,6 +66,7 @@
(defrule load-mysql-optional-clauses (* (or mysql-options
gucs
casts
alter-table
materialize-views
including-matching
excluding-matching
@ -186,6 +138,7 @@
(defun lisp-code-for-loading-from-mysql (my-db-conn pg-db-conn
&key
gucs casts views before after options
alter-table
((:including incl))
((:excluding excl))
((:decoding decoding-as)))
@ -207,6 +160,7 @@
:including ',incl
:excluding ',excl
:materialize-views ',views
:alter-table ',alter-table
:set-table-oids t
,@(remove-batch-control-option options))
@ -218,7 +172,7 @@
pg-db-uri
&key
gucs casts views before after
options including excluding decoding)
options alter-table including excluding decoding)
source
(cond (*dry-run*
(lisp-code-for-mysql-dry-run my-db-uri pg-db-uri))
@ -230,6 +184,7 @@
:before before
:after after
:options options
:alter-table alter-table
:including including
:excluding excluding
:decoding decoding))))))

View File

@ -51,3 +51,4 @@
(defrule trimmed-name (and ignore-whitespace name)
(:destructure (whitespace name) (declare (ignore whitespace)) name))
(defrule namestring-or-regex (or quoted-namestring quoted-regex))

View File

@ -301,7 +301,6 @@ select i.relname,
:collect (make-pgsql-index :name name
:schema schema
:table-name table-name
:table-oid table-oid
:primary primary
:unique unique

View File

@ -44,7 +44,7 @@
;;; API for Foreign Keys
;;;
(defstruct pgsql-fkey
name table-name columns foreign-table foreign-columns update-rule delete-rule)
name table columns foreign-table foreign-columns update-rule delete-rule)
(defgeneric format-pgsql-create-fkey (fkey)
(:documentation
@ -58,10 +58,10 @@
"Generate the PostgreSQL statement to rebuild a MySQL Foreign Key"
(format nil
"ALTER TABLE ~a ADD CONSTRAINT ~a FOREIGN KEY(~{~a~^,~}) REFERENCES ~a(~{~a~^,~})~:[~*~; ON UPDATE ~a~]~:[~*~; ON DELETE ~a~]"
(pgsql-fkey-table-name fk)
(format-table-name (pgsql-fkey-table fk))
(pgsql-fkey-name fk) ; constraint name
(pgsql-fkey-columns fk)
(pgsql-fkey-foreign-table fk)
(format-table-name (pgsql-fkey-foreign-table fk))
(pgsql-fkey-foreign-columns fk)
(pgsql-fkey-update-rule fk)
(pgsql-fkey-update-rule fk)
@ -71,9 +71,9 @@
(defmethod format-pgsql-drop-fkey ((fk pgsql-fkey) &key all-pgsql-fkeys)
"Generate the PostgreSQL statement to rebuild a MySQL Foreign Key"
(let* ((constraint-name (apply-identifier-case (pgsql-fkey-name fk)))
(table-name (apply-identifier-case (pgsql-fkey-table-name fk)))
(fkeys (cdr (assoc table-name all-pgsql-fkeys :test #'string=)))
(fkey-exists (member constraint-name fkeys :test #'string=)))
(table-name (format-table-name (pgsql-fkey-table fk)))
(fkeys (cdr (assoc table-name all-pgsql-fkeys :test #'string=)))
(fkey-exists (member constraint-name fkeys :test #'string=)))
(when fkey-exists
;; we could do that without all-pgsql-fkeys in 9.2 and following with:
;; alter table if exists ... drop constraint if exists ...
@ -230,7 +230,7 @@
;; the struct is used both for supporting new index creation from non
;; PostgreSQL system and for drop/create indexes when using the 'drop
;; indexes' option (in CSV mode and the like)
name schema table-name table-oid primary unique columns sql conname condef)
name schema table-oid primary unique columns sql conname condef)
(defgeneric format-pgsql-create-index (table index)
(:documentation

View File

@ -136,6 +136,7 @@
including
excluding
set-table-oids
alter-table
materialize-views)
"Export database source data and Import it into PostgreSQL"
(let* ((copy-kernel (make-kernel worker-count))
@ -166,6 +167,11 @@
;; cast the catalog into something PostgreSQL can work on
(cast catalog)
;; if asked, now alter the catalog with given rules: the alter-table
;; keyword parameter actually contains a set of alter table rules.
(when alter-table
(alter-table catalog alter-table))
;; if asked, first drop/create the tables on the PostgreSQL side
(handler-case
(cond ((and (or create-tables schema-only) (not data-only))
@ -200,10 +206,13 @@
(view-list catalog))
:do (let ((table-source (instanciate-table-copy-object copy table)))
(log-message :debug "TARGET: ~a" (target table-source))
;; that needs *print-circle* to true, and anyway it's too much
;; output in general.
;;
;; (log-message :debug "TARGET: ~a" (target table-source))
(log-message :debug "TRANSFORMS(~a): ~s"
(format-table-name table)
(mapcar #'column-transform (table-column-list table)))
(transforms table-source))
;; first COPY the data from source to PostgreSQL, using copy-kernel
(unless schema-only

View File

@ -197,7 +197,6 @@ order by SchemaName,
(table (find-table schema table-name))
(index (make-pgsql-index :name index-name
:primary (= pkey 1)
:table-name (format-table-name table)
:unique (= unique 1)
:columns (list col))))
(add-index table index))
@ -206,7 +205,7 @@ order by SchemaName,
(defun list-all-fkeys (catalog &key including excluding)
"Get the list of MSSQL index definitions per table."
(loop
:for (fkey-name schema-name table-name col fschema ftable fcol)
:for (fkey-name schema-name table-name col fschema-name ftable-name fcol)
:in (mssql-query (format nil "
SELECT
REPLACE(KCU1.CONSTRAINT_NAME, '.', '_') AS 'CONSTRAINT_NAME'
@ -252,17 +251,13 @@ ORDER BY KCU1.CONSTRAINT_NAME, KCU1.ORDINAL_POSITION"
:do
(let* ((schema (find-schema catalog schema-name))
(table (find-table schema table-name))
(fschema (find-schema catalog fschema-name))
(ftable (find-table fschema ftable-name))
(pg-fkey
(make-pgsql-fkey :name fkey-name
:table-name (format-table-name table)
:table table
:columns (list col)
:foreign-table
(format-table-name
;; for code re-use, create a table instance here.
(make-table
:source-name (cons fschema ftable)
:name (apply-identifier-case ftable)
:schema fschema))
:foreign-table ftable
:foreign-columns (list fcol)))
(fkey (maybe-add-fkey table fkey-name pg-fkey
:key #'pgloader.pgsql::pgsql-fkey-name)))

View File

@ -232,7 +232,6 @@ GROUP BY table_name, index_name;"
(index
(make-pgsql-index :name name ; further processing is needed
:primary (string= name "PRIMARY")
:table-name (apply-identifier-case table-name)
:unique (not (string= "1" non-unique))
:columns (mapcar
#'apply-identifier-case
@ -251,7 +250,7 @@ GROUP BY table_name, index_name;"
excluding)
"Get the list of MySQL Foreign Keys definitions per table."
(loop
:for (table-name name ftable cols fcols update-rule delete-rule)
:for (table-name name ftable-name cols fcols update-rule delete-rule)
:in (mysql-query (format nil "
SELECT tc.table_name, tc.constraint_name, k.referenced_table_name ft,
@ -283,25 +282,26 @@ GROUP BY table_name, index_name;"
~:[~*~;and (~{tc.table_name ~a~^ and ~})~]
GROUP BY tc.table_name, tc.constraint_name, ft"
(db-name *connection*) (db-name *connection*)
only-tables ; do we print the clause?
only-tables
including ; do we print the clause?
(filter-list-to-where-clause including)
excluding ; do we print the clause?
(filter-list-to-where-clause excluding t)))
:do (let ((table (find-table schema table-name))
(fk
(make-pgsql-fkey :name (apply-identifier-case name)
:table-name (apply-identifier-case table-name)
:columns (mapcar #'apply-identifier-case
(sq:split-sequence #\, cols))
:foreign-table (apply-identifier-case ftable)
:foreign-columns (mapcar
#'apply-identifier-case
(sq:split-sequence #\, fcols))
:update-rule update-rule
:delete-rule delete-rule)))
(db-name *connection*) (db-name *connection*)
only-tables ; do we print the clause?
only-tables
including ; do we print the clause?
(filter-list-to-where-clause including)
excluding ; do we print the clause?
(filter-list-to-where-clause excluding t)))
:do (let* ((table (find-table schema table-name))
(ftable (find-table schema ftable-name))
(fk
(make-pgsql-fkey :name (apply-identifier-case name)
:table table
:columns (mapcar #'apply-identifier-case
(sq:split-sequence #\, cols))
:foreign-table ftable
:foreign-columns (mapcar
#'apply-identifier-case
(sq:split-sequence #\, fcols))
:update-rule update-rule
:delete-rule delete-rule)))
(add-fkey table fk))
:finally
(return schema)))

View File

@ -124,7 +124,6 @@
:when sql
:do (let ((table (find-table schema table-name))
(idxdef (make-sqlite-idx :name index-name
:table-name table-name
:sql sql)))
(add-index table idxdef))
:finally (return schema))))

View File

@ -0,0 +1,71 @@
;;;
;;; ALTER TABLE allows pgloader to apply transformations on the catalog
;;; retrieved before applying it to PostgreSQL: SET SCHEMA, RENAME, etc.
;;;
(in-package :pgloader.schema)
#|
See src/parsers/command-alter-table.lisp
(make-match-rule :type :regex
:target "_list$"
:action #'pgloader.schema::alter-table-set-schema
:args (list "mv"))
|#
(defstruct match-rule type target action args)
(defgeneric alter-table (object alter-table-rule-list))
(defmethod alter-table ((catalog catalog) alter-table-rule-list)
"Apply ALTER-TABLE-RULE-LIST to all schema of CATALOG."
(loop :for schema :in (catalog-schema-list catalog)
:do (alter-table schema alter-table-rule-list)))
(defmethod alter-table ((schema schema) alter-table-rule-list)
"Apply ALTER-TABLE-RULE-LIST to all tables and views of SCHEMA."
(loop :for table :in (append (schema-table-list schema)
(schema-view-list schema))
:do (alter-table table alter-table-rule-list)))
(defmethod alter-table ((table table) alter-table-rule-list)
"Apply ALTER-TABLE-RULE-LIST to TABLE."
;;
;; alter-table-rule-list is a list of set of rules, within each set we
;; only apply the first rules that matches.
;;
(loop :for rule-list :in alter-table-rule-list
:do (let ((match-rule
(loop :for match-rule :in rule-list
:thereis (when (rule-matches match-rule table)
match-rule))))
(when match-rule
(apply (match-rule-action match-rule)
(list* table (match-rule-args match-rule)))))))
;;;
;;; ALTER TABLE actions: functions that take a table and arguments and apply
;;; the altering action wanted to them.
;;;
(defun alter-table-set-schema (table schema-name)
"Alter the schema of TABLE, set SCHEMA-NAME instead."
(setf (table-schema table) schema-name))
(defun alter-table-rename (table new-name)
"Alter the name of TABLE to NEW-NAME."
(setf (table-name table) new-name))
;;;
;;; Apply the match rules as given by the parser to a table name.
;;;
(declaim (inline rule-matches))
(defun rule-matches (match-rule table)
"Return non-nil when TABLE matches given MATCH-RULE."
(declare (type match-rule match-rule) (type table table))
(let ((table-name (table-source-name table)))
(ecase (match-rule-type match-rule)
(:string (string= (match-rule-target match-rule) table-name))
(:regex (cl-ppcre:scan (match-rule-target match-rule) table-name)))))
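#|
Usage sketch, mirroring what copy-database does with the parsed
:alter-table rules against a catalog built from a MySQL source:

    (alter-table catalog
                 (list (list (make-match-rule
                              :type :regex
                              :target "_list$"
                              :action #'alter-table-set-schema
                              :args (list "mv")))))

Every table whose source name matches _list$ then has its schema slot set
to "mv" before anything gets created on the PostgreSQL side.
|#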

View File

@ -21,10 +21,17 @@ load database
-- MATERIALIZE VIEWS film_list, staff_list
MATERIALIZE ALL VIEWS
ALTER TABLE NAMES MATCHING ~/_list$/, 'sales_by_store', ~/sales_by/
SET SCHEMA 'mv'
ALTER TABLE NAMES MATCHING 'sales_by_store' RENAME TO 'sales_by_store_list'
ALTER TABLE NAMES MATCHING 'film' RENAME TO 'films'
-- INCLUDING ONLY TABLE NAMES MATCHING ~/film/, 'actor'
-- EXCLUDING TABLE NAMES MATCHING ~<ory>
BEFORE LOAD DO
$$ create schema if not exists sakila; $$,
$$ alter database sakila set search_path to sakila, public; $$;
$$ create schema if not exists mv; $$,
$$ alter database sakila set search_path to sakila, mv, public; $$;