-
transmogrify.ploneremote-1.3-4.lbn19.noarch
transmogrify.ploneremote is a package of transmogrifier blueprints for uploading content to a Plone site via the Zope XML-RPC API.
The Plone site does not need any modifications; only the vanilla Zope XML-RPC interface is used.
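As a minimal sketch of how such an upload step might be wired into a pipeline, a section like the following could create content on the remote site. The blueprint and option names shown here (remoteconstructor, target) are assumptions about this package and should be checked against its documentation:
[remote-constructor]
blueprint = transmogrify.ploneremote.remoteconstructor
# URL of the remote Plone folder to create content in, including credentials (illustrative value)
target = http://admin:secret@localhost:8080/Plone/imported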
-
transmogrify.print-0.5.0-1.lbn13.noarch
Transmogrifier blueprint to print pipeline item keys
-
transmogrify.print-0.6.0-1.lbn19.noarch
Note
As of version 1.3, Transmogrifier provides a similar feature via a blueprint called collective.transmogrifier.sections.logger.
This Transmogrifier blueprint is based on collective.transmogrifier.sections.tests.PrettyPrinter, which anyone can use in their project by creating a utility like so:
<utility
    component="collective.transmogrifier.sections.tests.PrettyPrinter"
    name="print" />
Then adding a section to your pipeline like so:
[transmogrifier]
pipeline =
    …
    print

[print]
blueprint = print
transmogrify.print has two advantages over the above approach:
It adds the utility for you.
It allows you to specify a keys parameter to print individual keys; if no key is provided, it prints the entire item (see the sketch below).
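For example, a section that prints only a couple of keys might look like the following. This is a sketch under two assumptions: that transmogrify.print reuses the print blueprint name shown above, and that keys accepts several values, one per line:
[print]
blueprint = print
keys =
    _path
    title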
-
transmogrify.regexp-0.5.0-1.lbn13.noarch
transmogrify.regexp allows you to use regular expressions and format strings to search and replace key values in a transmogrifier pipeline.
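A minimal sketch of what such a section might look like in a pipeline. The option names used here (key, pattern, replace) are purely illustrative assumptions, not the package's documented options:
[map-types]
blueprint = transmogrify.regexp
# hypothetical options: the item key to rewrite, the regular expression to
# search for in its value, and the replacement text
key = _type
pattern = ^HTMLPage$
replace = Document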
-
transmogrify.regexp-0.5.0-1.lbn19.noarch
transmogrify.regexp allows you to use regular expressions and format strings to search and replace key values in a transmogrifier pipeline.
-
transmogrify.siteanalyser-1.3-2.lbn13.noarch
Transmogrifier blueprints that look at how HTML items are linked, in order to gather metadata about items; a sketch of chaining them follows the list below.
transmogrify.siteanalyser.defaultpage
Determines that an item is the default page for a container if it has many links to items in that container.
transmogrify.siteanalyser.relinker
Fixes links in HTML content. Earlier blueprints can adjust '_path' and record the original path in '_origin'; the relinker then fixes all img and href links. It also normalizes ids.
transmogrify.siteanalyser.attach
Finds attachments that are linked to from only a single page. Attachments are merged into the linking item, either by setting keys on it or by moving the attachment into a folder.
transmogrify.siteanalyser.title
Determines the title of an item from the link text used to link to it.
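Taken together, a hedged sketch of chaining these blueprints in a pipeline. The blueprint names come from the list above; the order shown is illustrative, each blueprint's own options are omitted, and the elided pipeline entries (…) stand for a source section (for example a webcrawler) and a final upload step defined elsewhere:
[transmogrifier]
pipeline =
    …
    defaultpage
    title
    attach
    relinker
    …

[defaultpage]
blueprint = transmogrify.siteanalyser.defaultpage

[title]
blueprint = transmogrify.siteanalyser.title

[attach]
blueprint = transmogrify.siteanalyser.attach

[relinker]
blueprint = transmogrify.siteanalyser.relinker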
-
transmogrify.siteanalyser-1.3-3.lbn19.noarch
Transmogrifier blueprints that look at how HTML items are linked, in order to gather metadata about items.
transmogrify.siteanalyser.defaultpage
Determines that an item is the default page for a container if it has many links to items in that container.
transmogrify.siteanalyser.relinker
Fixes links in HTML content. Earlier blueprints can adjust '_path' and record the original path in '_origin'; the relinker then fixes all img and href links. It also normalizes ids.
transmogrify.siteanalyser.attach
Finds attachments that are linked to from only a single page. Attachments are merged into the linking item, either by setting keys on it or by moving the attachment into a folder.
transmogrify.siteanalyser.title
Determines the title of an item from the link text used to link to it.
-
transmogrify.sqlalchemy-1.0.1-2.lbn13.noarch
Feed data from SQLAlchemy into a transmogrifier pipeline
-
transmogrify.sqlalchemy-1.0.2-1.lbn19.noarch
This package implements a simple SQLAlchemy blueprint for collective.transmogrifier.
If you are not familiar with transmogrifier, please read its documentation first to get a basic understanding of how you can use this package.
This package implements the transmogrify.sqlalchemy blueprint, which executes a SQL statement, generally a query, and feeds the return values from that query into the transmogrifier pipeline.
Configuration
A transmogrify.sqlalchemy blueprint takes two or more parameters:
dsn
Connection information for the SQL database. The exact format is documented in the SQLAlchemy documentation for create_engine() arguments.
query*
The SQL queries that will be executed. Any parameter starting with ‘query’ will be executed, in sorted order.
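For example, a source section combining the two parameters described above might look like the following. The DSN, table and column names are illustrative only; presumably each returned row becomes one pipeline item keyed by its column names:
[sqlsource]
blueprint = transmogrify.sqlalchemy
# any SQLAlchemy URL accepted by create_engine() can be used here
dsn = sqlite:///content.db
# parameters starting with 'query' run in sorted order: query1, then query2
query1 = SELECT id, title FROM pages
query2 = SELECT id, body FROM page_bodies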
-
transmogrify.webcrawler-1.2.1-2.lbn13.noarch
A source blueprint for crawling content from a live site or from local HTML files.
Webcrawler imports HTML either from a live website, from a folder on disk, or from a folder on disk containing HTML that originally came from a live website and may still have absolute links referring to that website.
To crawl a live website, supply the crawler with a base HTTP URL to start crawling from. This URL must be the prefix shared by all the other URLs you want to fetch from the site.
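A minimal sketch of such a crawling source section, assuming the option naming the starting point is called url (an assumption to verify against the package documentation):
[crawler]
blueprint = transmogrify.webcrawler
# base URL: only pages whose URLs start with this prefix are crawled (illustrative value)
url = http://www.example.com/about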