Compare commits

22 Commits

f4b08fb951 lpg: improve Picture sizing, clean up
Restraining a Picture in one dimension with a Frame should make it
report the right overall dimensions (keeping its aspect ratio).

Also applying the F.9 C++ Core Guideline.
2025-01-12 10:11:33 +01:00
e8752e53ac Add a Lua PDF generator
Publishing my old invoice layouter in a reusable scripting-based form,
rather than an annoyingly fixed binary.

Because Lua compiled as C++ might be hard to find, we provide a Meson wrap.
Curiously, only GitHub releases seem to contain onelua.c,
which is a very handy file.

We could have also subprojected libqr, which is in the public domain;
however, the other main dependencies are LGPL just like libqrencode,
and libqrencode is likely to be installed already.

The user manual also serves as a test.
2025-01-11 15:25:13 +01:00
147b880524 Fix test.sh for Alpine's current lowriter
2024-04-09 22:50:39 +02:00
a02966d1d1 README.adoc: actually make the extfs name match
2024-02-04 06:35:37 +01:00
ba5fdf20df README.adoc: fix and improve Go instructions 2024-02-04 06:04:16 +01:00
a8dc72349b extfs-pdf: add a file extension for FlateDecode
It is recognised by shared-mime-info.
2024-02-04 06:03:58 +01:00
32e9acfa77 Go: enable multiple updates in a sequence
This is not something anyone should do, but let's do things correctly.
2024-02-04 05:17:26 +01:00
ff7de4b141 Go: cleanup 2024-02-04 05:16:28 +01:00
0b837b3a0e Go: add PDF 1.5 cross-reference stream support
This is not particularly complete, but it works (again) for Cairo.
2024-02-04 04:27:10 +01:00
55a17a69b7 README.adoc: update package information 2023-07-01 22:03:18 +02:00
3781aa8e85 Don't fail tests when gropdf isn't installed 2023-06-28 23:27:30 +02:00
69b939c707 Fix tests, document new limitation 2023-06-28 23:12:42 +02:00
87681d15ba Go: bump modules 2023-06-28 22:35:49 +02:00
f01d25596e Fix the man page
> Any reference to the subject of the current manual page
> should be written with the name in bold.
2022-09-25 18:28:19 +02:00
67596a8153 extfs-pdf: improve the listing format 2021-12-09 20:33:40 +01:00
8a00d7064b Update documentation 2021-12-09 15:28:01 +01:00
b358467791 Add an external VFS for Midnight Commander 2021-12-09 15:24:25 +01:00
d0f80aa6ae Go: enable listing all indirect objects 2021-12-09 14:07:15 +01:00
97ffe3d46e Go: implement stream parsing/serialization 2021-12-09 14:07:14 +01:00
1a3c7a8282 Go: add Updater.Dereference() 2021-12-08 21:33:26 +01:00
d8171b9ac4 Go: improve error handling 2021-12-08 20:49:06 +01:00
bcb24af926 Minor revision 2021-12-08 20:39:02 +01:00
19 changed files with 2218 additions and 88 deletions


@@ -1,4 +1,4 @@
-Copyright (c) 2017 - 2020, Přemysl Eric Janouch <p@janouch.name>
+Copyright (c) 2017 - 2025, Přemysl Eric Janouch <p@janouch.name>
 Permission to use, copy, modify, and/or distribute this software for any
 purpose with or without fee is hereby granted.


@@ -2,11 +2,19 @@ pdf-simple-sign
 ===============
 'pdf-simple-sign' is a simple PDF signer intended for documents produced by
-the Cairo library, GNU troff, ImageMagick, or similar.
+the Cairo library (≤ 1.17.4 or using PDF 1.4), GNU troff, ImageMagick,
+or similar.
 
 I don't aim to extend the functionality any further. The project is fairly
 self-contained and it should be easy to grasp and change to suit to your needs.
 
+Packages
+--------
+Regular releases are sporadic. git master should be stable enough.
+You can get a package with the latest development version using Arch Linux's
+https://aur.archlinux.org/packages/pdf-simple-sign-git[AUR],
+or as a https://git.janouch.name/p/nixexprs[Nix derivation].
+
 Documentation
 -------------
 See the link:pdf-simple-sign.adoc[man page] for information about usage.
@@ -25,9 +33,39 @@ Runtime dependencies: libcrypto (OpenSSL 1.1 API)
  $ cd builddir
  $ ninja
 
-In addition to the C++ version, also included is a native Go port:
+Go
+~~
+In addition to the C++ version, also included is a native Go port,
+which has enhanced PDF 1.5 support:
 
- $ go get janouch.name/pdf-simple-sign/cmd/pdf-simple-sign
+----
+$ go install janouch.name/pdf-simple-sign/cmd/pdf-simple-sign@master
+----
+
+and a crude external VFS for Midnight Commander, that may be used to extract
+all streams from a given PDF file:
+
+----
+$ GOBIN=$HOME/.local/share/mc/extfs.d \
+    go install janouch.name/pdf-simple-sign/cmd/extfs-pdf@master
+----
+
+To enable the VFS, edit your _~/.config/mc/mc.ext.ini_ to contain:
+
+----
+[pdf]
+Type=^PDF
+Open=%cd %p/extfs-pdf://
+----
+
+Lua PDF generator
+~~~~~~~~~~~~~~~~~
+Build dependencies: Meson, a C++17 compiler, pkg-config +
+Runtime dependencies: C++ Lua >= 5.3 (custom Meson wrap fallback),
+ cairo >= 1.15.4, pangocairo, libqrencode
+
+This is a parasitic subproject located in the _lpg_ subdirectory.
+It will generate its own documentation.
+
 Contributing and Support
 ------------------------

cmd/extfs-pdf/main.go (new file, 141 lines)

@@ -0,0 +1,141 @@
//
// Copyright (c) 2021 - 2024, Přemysl Eric Janouch <p@janouch.name>
//
// Permission to use, copy, modify, and/or distribute this software for any
// purpose with or without fee is hereby granted.
//
// THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
// WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
// MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY
// SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
// WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION
// OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN
// CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
//
// extfs-pdf is an external VFS plugin for Midnight Commander.
// More serious image extractors should rewrite this to use pdfimages(1).
package main
import (
"flag"
"fmt"
"os"
"time"
"janouch.name/pdf-simple-sign/pdf"
)
func die(status int, format string, args ...interface{}) {
os.Stderr.WriteString(fmt.Sprintf(format+"\n", args...))
os.Exit(status)
}
func usage() {
die(1, "Usage: %s [-h] COMMAND DOCUMENT [ARG...]", os.Args[0])
}
func streamSuffix(o *pdf.Object) string {
if filter, _ := o.Dict["Filter"]; filter.Kind == pdf.Name {
switch filter.String {
case "JBIG2Decode":
// This is the file extension used by pdfimages(1).
// This is not a complete JBIG2 standalone file.
return "jb2e"
case "JPXDecode":
return "jp2"
case "DCTDecode":
return "jpg"
case "FlateDecode":
return "zz"
default:
return filter.String
}
}
return "stream"
}
func list(mtime time.Time, updater *pdf.Updater) {
stamp := mtime.Local().Format("01-02-2006 15:04:05")
for _, o := range updater.ListIndirect() {
object, err := updater.Get(o.N, o.Generation)
size := 0
if err != nil {
fmt.Fprintf(os.Stderr, "%s\n", err)
} else {
// Accidental transformation, retrieving original data is more work.
size = len(object.Serialize())
}
fmt.Printf("-r--r--r-- 1 0 0 %d %s n%dg%d\n",
size, stamp, o.N, o.Generation)
if object.Kind == pdf.Stream {
fmt.Printf("-r--r--r-- 1 0 0 %d %s n%dg%d.%s\n", len(object.Stream),
stamp, o.N, o.Generation, streamSuffix(&object))
}
}
}
func copyout(updater *pdf.Updater, storedFilename, extractTo string) {
var (
n, generation uint
suffix string
)
m, err := fmt.Sscanf(storedFilename, "n%dg%d%s", &n, &generation, &suffix)
if m < 2 {
die(3, "%s: %s", storedFilename, err)
}
object, err := updater.Get(n, generation)
if err != nil {
die(3, "%s: %s", storedFilename, err)
}
content := []byte(object.Serialize())
if suffix != "" {
content = object.Stream
}
if err = os.WriteFile(extractTo, content, 0666); err != nil {
die(3, "%s", err)
}
}
func main() {
flag.Usage = usage
flag.Parse()
if flag.NArg() < 2 {
usage()
}
command, documentPath := flag.Arg(0), flag.Arg(1)
doc, err := os.ReadFile(documentPath)
if err != nil {
die(1, "%s", err)
}
mtime := time.UnixMilli(0)
if info, err := os.Stat(documentPath); err == nil {
mtime = info.ModTime()
}
updater, err := pdf.NewUpdater(doc)
if err != nil {
die(2, "%s", err)
}
switch command {
default:
die(1, "unsupported command: %s", command)
case "list":
if flag.NArg() != 2 {
usage()
} else {
list(mtime, updater)
}
case "copyout":
if flag.NArg() != 4 {
usage()
} else {
copyout(updater, flag.Arg(2), flag.Arg(3))
}
}
}
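
Outside of Midnight Commander, the plugin can also be driven by hand. A
hypothetical invocation, based only on the usage() and command handling
shown above (the document name and the n1g0 entry are placeholders):

$ extfs-pdf list document.pdf
$ extfs-pdf copyout document.pdf n1g0 object.dat
$ extfs-pdf copyout document.pdf n5g0.zz stream.zz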

go.mod (4 lines changed)

@@ -3,6 +3,6 @@ module janouch.name/pdf-simple-sign
 go 1.17
 
 require (
-	go.mozilla.org/pkcs7 v0.0.0-20200128120323-432b2356ecb1
-	golang.org/x/crypto v0.0.0-20200728195943-123391ffb6de
+	go.mozilla.org/pkcs7 v0.0.0-20210826202110-33d05740a352
+	golang.org/x/crypto v0.10.0
 )

go.sum (4 lines changed)

@@ -1,8 +1,12 @@
 go.mozilla.org/pkcs7 v0.0.0-20200128120323-432b2356ecb1 h1:A/5uWzF44DlIgdm/PQFwfMkW0JX+cIcQi/SwLAmZP5M=
 go.mozilla.org/pkcs7 v0.0.0-20200128120323-432b2356ecb1/go.mod h1:SNgMg+EgDFwmvSmLRTNKC5fegJjB7v23qTQ0XLGUNHk=
+go.mozilla.org/pkcs7 v0.0.0-20210826202110-33d05740a352 h1:CCriYyAfq1Br1aIYettdHZTy8mBTIPo7We18TuO/bak=
+go.mozilla.org/pkcs7 v0.0.0-20210826202110-33d05740a352/go.mod h1:SNgMg+EgDFwmvSmLRTNKC5fegJjB7v23qTQ0XLGUNHk=
 golang.org/x/crypto v0.0.0-20190308221718-c2843e01d9a2/go.mod h1:djNgcEr1/C05ACkg1iLfiJU5Ep61QUkGW8qpdssI0+w=
 golang.org/x/crypto v0.0.0-20200728195943-123391ffb6de h1:ikNHVSjEfnvz6sxdSPCaPt572qowuyMDMJLLm3Db3ig=
 golang.org/x/crypto v0.0.0-20200728195943-123391ffb6de/go.mod h1:LzIPMQfyMNhhGPhUkYOs5KpL4U8rLKemX1yGLhDgUto=
+golang.org/x/crypto v0.10.0 h1:LKqV2xt9+kDzSTfOhx4FrkEBcMrAgHSYgzywV9zcGmM=
+golang.org/x/crypto v0.10.0/go.mod h1:o4eNf7Ede1fv+hwOwZsTHl9EsPFO6q6ZvYR8vYfY45I=
 golang.org/x/net v0.0.0-20190404232315-eb5bcb51f2a3/go.mod h1:t9HGtf8HONx5eT2rtn7q6eTqICYqUVnKs3thJo3Qplg=
 golang.org/x/sys v0.0.0-20190215142949-d0b11bdaac8a/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
 golang.org/x/sys v0.0.0-20190412213103-97732733099d/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=

lpg/.clang-format (new file, 9 lines)

@@ -0,0 +1,9 @@
BasedOnStyle: LLVM
ColumnLimit: 80
IndentWidth: 4
TabWidth: 4
UseTab: ForContinuationAndIndentation
SpaceAfterCStyleCast: true
AlignAfterOpenBracket: DontAlign
AlignOperands: DontAlign
SpacesBeforeTrailingComments: 2

lpg/lpg.cpp (new file, 1160 lines): diff not shown because it is too large

lpg/lpg.lua (new file, 240 lines)

@@ -0,0 +1,240 @@
#!/usr/bin/env lpg
local project_url = "https://git.janouch.name/p/pdf-simple-sign"
function h1 (title)
return lpg.VBox {fontsize=18., fontweight=600,
title, lpg.HLine {2}, lpg.Filler {-1, 6}}
end
function h2 (title)
return lpg.VBox {fontsize=16., fontweight=600,
lpg.Filler {-1, 8}, title, lpg.HLine {1}, lpg.Filler {-1, 6}}
end
function h3 (title)
return lpg.VBox {fontsize=14., fontweight=600,
lpg.Filler {-1, 8}, title, lpg.HLine {.25}, lpg.Filler {-1, 6}}
end
function p (...)
return lpg.VBox {..., lpg.Filler {-1, 6}}
end
function code (...)
return lpg.VBox {
lpg.Filler {-1, 4},
lpg.HBox {
lpg.Filler {12},
lpg.VBox {"<tt>" .. table.concat {...} .. "</tt>"},
lpg.Filler {},
},
lpg.Filler {-1, 6},
}
end
function define (name, ...)
return lpg.VBox {
lpg.Filler {-1, 2},
lpg.Text {fontweight=600, name}, lpg.Filler {-1, 2},
lpg.HBox {lpg.Filler {12}, lpg.VBox {...}, lpg.Filler {}},
lpg.Filler {-1, 2},
}
end
function pad (widget)
return lpg.VBox {
lpg.Filler {-1, 2},
lpg.HBox {lpg.Filler {4}, widget, lpg.Filler {}, lpg.Filler {4}},
lpg.Filler {-1, 2},
}
end
local page1 = lpg.VBox {fontfamily="sans serif", fontsize=12.,
h1("lpg User Manual"),
p("<b>lpg</b> is a Lua-based PDF document generator, exposing a trivial " ..
"layouting engine on top of the Cairo graphics library, " ..
"with manual paging."),
p("The author has primarily been using this system to typeset invoices."),
h2("Synopsis"),
p("<b>lpg</b> <i>program.lua</i> [<i>args...</i>]"),
h2("API"),
p("The Lua program receives <b>lpg</b>'s and its own path joined " ..
"as <tt>arg[0]</tt>. Any remaining sequential elements " ..
"of this table represent the passed <i>args</i>."),
h3("Utilities"),
define("lpg.cm (centimeters)",
p("Returns how many document points are needed " ..
"for the given physical length.")),
define("lpg.ntoa {number [, precision=…]\n" ..
"\t[, thousands_sep=…] [, decimal_point=…] [, grouping=…]}",
p("Formats a number using the C++ localization " ..
"and I/O libraries. " ..
"For example, the following call results in “3 141,59”:"),
code("ntoa {3141.592, precision=2,\n" ..
" thousands_sep=\" \", decimal_point=\",\", " ..
"grouping=\"\\003\"}")),
define("lpg.escape (values...)",
p("Interprets all values as strings, " ..
"and escapes them to be used as literal text—" ..
"all text within <b>lpg</b> is parsed as Pango markup, " ..
"which is a subset of XML.")),
h3("PDF documents"),
define("lpg.Document (filename, width, height [, margin])",
p("Returns a new <i>Document</i> object, whose pages are all " ..
"the same size in 72 DPI points, as specified by <b>width</b> " ..
"and <b>height</b>. The <b>margin</b> is used by <b>show</b> " ..
"on all sides of pages."),
p("The file is finalized when the object is garbage collected.")),
define("<i>Document</i>.title, author, subject, keywords, " ..
"creator, create_date, mod_date",
p("Write-only PDF <i>Info</i> dictionary metadata strings.")),
define("<i>Document</i>:show ([widget...])",
p("Starts a new document page, and renders <i>Widget</i> trees over " ..
"the whole print area.")),
lpg.Filler {},
}
local page2 = lpg.VBox {fontfamily="sans serif", fontsize=12.,
h3("Widgets"),
p("The layouting system makes heavy use of composition, " ..
"and thus stays simple."),
p("For convenience, anywhere a <i>Widget</i> is expected but another " ..
"kind of value is received, <b>lpg.Text</b> widget will be invoked " ..
"on that value."),
p("Once a <i>Widget</i> is included in another <i>Widget</i>, " ..
"the original Lua object can no longer be used, " ..
"as its reference has been consumed."),
p("<i>Widgets</i> can be indexed by strings to get or set " ..
"their <i>attributes</i>. All <i>Widget</i> constructor tables " ..
"also accept attributes, for convenience. Attributes can be " ..
"either strings or numbers, mostly only act " ..
"on specific <i>Widget</i> kinds, and are hereditary. " ..
"Prefix their names with an underscore to set them privately."),
p("<i>Widget</i> sizes can be set negative, which signals to their " ..
"container that they should take any remaining space, " ..
"after all their siblings requests have been satisfied. " ..
"When multiple widgets make this request, that space is distributed " ..
"in proportion to these negative values."),
define("lpg.Filler {[width] [, height]}",
p("Returns a new blank widget with the given dimensions, " ..
"which default to -1, -1.")),
define("lpg.HLine {[thickness]}",
p("Returns a new widget that draws a simple horizontal line " ..
"of the given <b>thickness</b>.")),
define("lpg.VLine {[thickness]}",
p("Returns a new widget that draws a simple vertical line " ..
"of the given <b>thickness</b>.")),
define("lpg.Text {[value...]}",
p("Returns a new text widget that renders the concatenation of all " ..
"passed values filtered through Luas <b>tostring</b> " ..
"function. Non-strings will additionally be escaped."),
define("<i>Text</i>.fontfamily, fontsize, fontweight, lineheight",
p("Various font properties, similar to their CSS counterparts."))),
define("lpg.Frame {widget}",
p("Returns a special container widget that can override " ..
"a few interesting properties."),
define("<i>Frame</i>.color",
p("Text and line colour, for example <tt>0xff0000</tt> for red.")),
define("<i>Frame</i>.w_override",
p("Forcefully changes the child <i>Widget</i>s " ..
"requested width, such as to negative values.")),
define("<i>Frame</i>.h_override",
p("Forcefully changes the child <i>Widget</i>s " ..
"requested height, such as to negative values."))),
lpg.Filler {},
}
local page3 = lpg.VBox {fontfamily="sans serif", fontsize=12.,
define("lpg.Link {target, widget}",
p("Returns a new hyperlink widget pointing to the <b>target</b>, " ..
"which is a URL. The hyperlink applies " ..
"to the entire area of the child widget. " ..
"It has no special appearance.")),
define("lpg.HBox {[widget...]}",
p("Returns a new container widget that places children " ..
"horizontally, from left to right."),
p("If any space remains after satisfying the children widgets " ..
"requisitions, it is distributed equally amongst all of them. " ..
"Also see the note about negative sizes.")),
define("lpg.VBox {[widget...]}",
p("Returns a new container widget that places children " ..
"vertically, from top to bottom.")),
define("lpg.Picture {filename}",
p("Returns a new picture widget, showing the given <b>filename</b>, " ..
"which currently must be in the PNG format. " ..
"Pictures are rescaled to fit, but keep their aspect ratio.")),
define("lpg.QR {contents, module}",
p("Returns a new QR code widget, encoding the <b>contents</b> " ..
"string using the given <b>module</b> size. " ..
"The QR code version is chosen automatically.")),
h2("Examples"),
p("See the source code of this user manual " ..
"for the general structure of scripts."),
h3("Size distribution and composition"),
lpg.VBox {
lpg.HLine {},
lpg.HBox {
lpg.VLine {}, lpg.Frame {_w_override=lpg.cm(3), pad "3cm"},
lpg.VLine {}, lpg.Frame {pad "Measured"},
lpg.VLine {}, lpg.Frame {_w_override=-1, pad "-1"},
lpg.VLine {}, lpg.Frame {_w_override=-2, pad "-2"},
lpg.VLine {},
},
lpg.HLine {},
},
lpg.Filler {-1, 6},
code([[
<small><b>function</b> pad (widget)
<b>local function</b> f (...) <b>return</b> lpg.Filler {...} <b>end</b>
<b>return</b> lpg.VBox {f(-1, 2), lpg.HBox {f(4), w, f(), f(4)}, f(-1, 2)}
<b>end</b>
lpg.VBox {lpg.HLine {}, lpg.HBox {
lpg.VLine {}, lpg.Frame {_w_override=lpg.cm(3), pad "3cm"},
lpg.VLine {}, lpg.Frame {pad "Measured"},
lpg.VLine {}, lpg.Frame {_w_override=-1, pad "-1"},
lpg.VLine {}, lpg.Frame {_w_override=-2, pad "-2"},
lpg.VLine {},
}, lpg.HLine {}}</small>]]),
h3("Clickable QR code link"),
lpg.HBox {
lpg.VBox {
p("Go here to report bugs, request features, " ..
"or submit pull requests:"),
code(([[
url = "%s"
lpg.Link {url, lpg.QR {url, 2.5}}]]):format(project_url)),
},
lpg.Filler {},
lpg.Link {project_url, lpg.QR {project_url, 2.5}},
},
lpg.Filler {},
}
if #arg < 1 then
io.stderr:write("Usage: " .. arg[0] .. " OUTPUT-PDF..." .. "\n")
os.exit(false)
end
local width, height, margin = lpg.cm(21), lpg.cm(29.7), lpg.cm(2.0)
for i = 1, #arg do
local pdf = lpg.Document(arg[i], width, height, margin)
pdf.title = "lpg User Manual"
pdf.subject = "lpg User Manual"
pdf.author = "Přemysl Eric Janouch"
pdf.creator = ("lpg (%s)"):format(project_url)
pdf:show(page1)
pdf:show(page2)
pdf:show(page3)
end

lpg/meson.build (new file, 24 lines)

@@ -0,0 +1,24 @@
project('lpg', 'cpp', default_options : ['cpp_std=c++17'],
version : '1.1.1')
conf = configuration_data()
conf.set_quoted('PROJECT_NAME', meson.project_name())
conf.set_quoted('PROJECT_VERSION', meson.project_version())
configure_file(output : 'config.h', configuration : conf)
luapp = dependency('lua++', allow_fallback : true)
cairo = dependency('cairo')
pangocairo = dependency('pangocairo')
libqrencode = dependency('libqrencode')
lpg_exe = executable('lpg', 'lpg.cpp',
install : true,
dependencies : [luapp, cairo, pangocairo, libqrencode])
# XXX: https://github.com/mesonbuild/meson/issues/825
docdir = get_option('datadir') / 'doc' / meson.project_name()
lpg_pdf = custom_target('lpg.pdf',
output : 'lpg.pdf',
input : 'lpg.lua',
command : [lpg_exe, '@INPUT@', '@OUTPUT@'],
install_dir : docdir,
build_by_default : true)
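
The custom_target above doubles as a usage example: the manual is rebuilt
from its own Lua source. A sketch of regenerating it by hand, assuming lpg
is on the PATH (the output filename is arbitrary):

$ lpg lpg.lua lpg.pdf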


@@ -0,0 +1,10 @@
[wrap-file]
directory = lua-5.4.7
source_url = https://github.com/lua/lua/archive/refs/tags/v5.4.7.tar.gz
source_filename = lua-5.4.7.tar.gz
source_hash = 5c39111b3fc4c1c9e56671008955a1730f54a15b95e1f1bd0752b868b929d8e3
patch_directory = lua-5.4.7
[provide]
lua++-5.4 = lua_dep
lua++ = lua_dep


@@ -0,0 +1,20 @@
Copyright (c) 2025 Přemysl Eric Janouch <p@janouch.name>
Copyright (c) 2021 The Meson development team
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.


@@ -0,0 +1,50 @@
project(
'lua-5.4',
'cpp',
license : 'MIT',
meson_version : '>=0.49.2',
version : '5.4.7',
default_options : ['c_std=c99', 'warning_level=2'],
)
cxx = meson.get_compiler('cpp')
# Skip bogus warning.
add_project_arguments(cxx.get_supported_arguments(
'-Wno-string-plus-int', '-Wno-stringop-overflow'), language : 'cpp')
# Platform-specific defines.
is_posix = host_machine.system() in ['cygwin', 'darwin', 'dragonfly', 'freebsd',
'gnu', 'haiku', 'linux', 'netbsd', 'openbsd', 'sunos']
if is_posix
add_project_arguments('-DLUA_USE_POSIX', language : 'cpp')
endif
# Library dependencies.
lua_lib_deps = [cxx.find_library('m', required : false)]
if meson.version().version_compare('>= 0.62')
dl_dep = dependency('dl', required : get_option('loadlib'))
else
dl_dep = cxx.find_library('dl', required : get_option('loadlib'))
endif
if dl_dep.found()
lua_lib_deps += dl_dep
add_project_arguments('-DLUA_USE_DLOPEN', language : 'cpp')
endif
# Targets.
add_project_arguments('-DMAKE_LIB', language : 'cpp')
lua_lib = static_library(
'lua',
'onelua.cpp',
dependencies : lua_lib_deps,
implicit_include_directories : false,
)
inc = include_directories('.')
lua_dep = declare_dependency(
link_with : lua_lib,
include_directories : inc,
)


@@ -0,0 +1,4 @@
option(
'loadlib', type : 'feature',
description : 'Allow Lua to "require" C extension modules'
)


@@ -0,0 +1 @@
#include "onelua.c"


@@ -14,10 +14,10 @@ executable('pdf-simple-sign', 'pdf-simple-sign.cpp',
 asciidoctor = find_program('asciidoctor')
 foreach page : ['pdf-simple-sign']
 	custom_target('manpage for ' + page,
-		input: page + '.adoc', output: page + '.1',
-		command: [asciidoctor, '-b', 'manpage',
+		input : page + '.adoc', output: page + '.1',
+		command : [asciidoctor, '-b', 'manpage',
 			'-a', 'release-version=' + meson.project_version(),
 			'@INPUT@', '-o', '@OUTPUT@'],
-		install: true,
-		install_dir: join_paths(get_option('mandir'), 'man1'))
+		install : true,
+		install_dir : join_paths(get_option('mandir'), 'man1'))
 endforeach


@@ -14,7 +14,7 @@ Synopsis
 Description
 -----------
-'pdf-simple-sign' is a simple PDF signer intended for documents produced by
+*pdf-simple-sign* is a simple PDF signer intended for documents produced by
 the Cairo library, GNU troff, ImageMagick, or similar. As such, it currently
 comes with some restrictions:


@@ -64,7 +64,7 @@ std::string ssprintf(const std::string& format, Args... args) {
 // -------------------------------------------------------------------------------------------------
 /// PDF token/object thingy. Objects may be composed either from one or a sequence of tokens.
-/// The PDF Reference doesn't actually speak of tokens.
+/// The PDF Reference doesn't actually speak of tokens, though ISO 32000-1:2008 does.
 struct pdf_object {
 	enum type {
 		END, NL, COMMENT, NIL, BOOL, NUMERIC, KEYWORD, NAME, STRING,
@@ -543,8 +543,8 @@ std::string pdf_updater::initialize() {
 		const auto prev_offset = trailer.dict.find("Prev");
 		if (prev_offset == trailer.dict.end())
 			break;
-		// FIXME we don't check for size_t over or underflow
-		if (!prev_offset->second.is_integer())
+		// FIXME do not read offsets and sizes as floating point numbers
+		if (!prev_offset->second.is_integer() || prev_offset->second.number < 0)
 			return "invalid Prev offset";
 		xref_offset = prev_offset->second.number;
 	}


@@ -1,5 +1,5 @@
 //
-// Copyright (c) 2018 - 2020, Přemysl Eric Janouch <p@janouch.name>
+// Copyright (c) 2018 - 2024, Přemysl Eric Janouch <p@janouch.name>
 //
 // Permission to use, copy, modify, and/or distribute this software for any
 // purpose with or without fee is hereby granted.
@@ -18,6 +18,8 @@ package pdf
 import (
 	"bytes"
+	"compress/zlib"
+	"encoding/binary"
 	"encoding/hex"
 	"errors"
 	"fmt"
@@ -59,20 +61,22 @@ const (
 	// higher-level objects
 	Array
 	Dict
+	Stream
 	Indirect
 	Reference
 )
 
 // Object is a PDF token/object thingy. Objects may be composed either from
 // one or a sequence of tokens. The PDF Reference doesn't actually speak
-// of tokens.
+// of tokens, though ISO 32000-1:2008 does.
 type Object struct {
 	Kind ObjectKind
 
 	String        string            // Comment/Keyword/Name/String
 	Number        float64           // Bool, Numeric
 	Array         []Object          // Array, Indirect
-	Dict          map[string]Object // Dict, in the future also Stream
+	Dict          map[string]Object // Dict, Stream
+	Stream        []byte            // Stream
 	N, Generation uint              // Indirect, Reference
 }
@@ -118,6 +122,13 @@ func NewDict(d map[string]Object) Object {
 	return Object{Kind: Dict, Dict: d}
 }
 
+func NewStream(d map[string]Object, s []byte) Object {
+	if d == nil {
+		d = make(map[string]Object)
+	}
+	return Object{Kind: Stream, Dict: d, Stream: s}
+}
+
 func NewIndirect(o Object, n, generation uint) Object {
 	return Object{Kind: Indirect, N: n, Generation: generation,
 		Array: []Object{o}}
@@ -458,6 +469,10 @@ func (o *Object) Serialize() string {
 			fmt.Fprint(b, " /", k, " ", v.Serialize())
 		}
 		return "<<" + b.String() + " >>"
+	case Stream:
+		d := NewDict(o.Dict)
+		d.Dict["Length"] = NewNumeric(float64(len(o.Stream)))
+		return d.Serialize() + "\nstream\n" + string(o.Stream) + "\nendstream"
 	case Indirect:
 		return fmt.Sprintf("%d %d obj\n%s\nendobj", o.N, o.Generation,
 			o.Array[0].Serialize())
@@ -471,8 +486,9 @@ func (o *Object) Serialize() string {
 // -----------------------------------------------------------------------------
 
 type ref struct {
-	offset     int64 // file offset or N of the next free entry
+	offset     int64 // file offset, or N of the next free entry, or index
 	generation uint  // object generation
+	compressed *uint // PDF 1.5: N of the containing compressed object
 	nonfree    bool  // whether this N is taken (for a good zero value)
 }
@@ -497,6 +513,65 @@ type Updater struct {
 	Trailer map[string]Object
 }
// ListIndirect returns the whole cross-reference table as Reference Objects.
func (u *Updater) ListIndirect() []Object {
result := []Object{}
for i := 0; i < len(u.xref); i++ {
if u.xref[i].nonfree {
result = append(result, NewReference(uint(i), u.xref[i].generation))
}
}
return result
}
func (u *Updater) parseStream(lex *Lexer, stack *[]Object) (Object, error) {
lenStack := len(*stack)
if lenStack < 1 {
return newError("missing stream dictionary")
}
dict := (*stack)[lenStack-1]
if dict.Kind != Dict {
return newError("stream not preceded by a dictionary")
}
*stack = (*stack)[:lenStack-1]
length, ok := dict.Dict["Length"]
if !ok {
return newError("missing stream Length")
}
length, err := u.Dereference(length)
if err != nil {
return length, err
}
if !length.IsUint() || length.Number > math.MaxInt {
return newError("stream Length not an unsigned integer")
}
// Expect exactly one newline.
if nl, err := lex.Next(); err != nil {
return nl, err
} else if nl.Kind != NL {
return newError("stream does not start with a newline")
}
size := int(length.Number)
if len(lex.P) < size {
return newError("stream is longer than the document")
}
dict.Kind = Stream
dict.Stream = lex.P[:size]
lex.P = lex.P[size:]
// Skip any number of trailing newlines or comments.
if end, err := u.parse(lex, stack); err != nil {
return end, err
} else if end.Kind != Keyword || end.String != "endstream" {
return newError("improperly terminated stream")
}
return dict, nil
}
 func (u *Updater) parseIndirect(lex *Lexer, stack *[]Object) (Object, error) {
 	lenStack := len(*stack)
 	if lenStack < 2 {
@@ -590,15 +665,11 @@ func (u *Updater) parse(lex *Lexer, stack *[]Object) (Object, error) {
 		}
 		return NewDict(dict), nil
 	case Keyword:
-		// Appears in the document body, typically needs
-		// to access the cross-reference table.
-		//
-		// TODO(p): Use the xref to read /Length etc. once we
-		// actually need to read such objects; presumably
-		// streams can use the Object.String member.
 		switch token.String {
 		case "stream":
-			return newError("streams are not supported yet")
+			// Appears in the document body,
+			// typically needs to access the cross-reference table.
+			return u.parseStream(lex, stack)
 		case "obj":
 			return u.parseIndirect(lex, stack)
 		case "R":
@@ -610,16 +681,159 @@ func (u *Updater) parse(lex *Lexer, stack *[]Object) (Object, error) {
 	}
 }
 
-func (u *Updater) loadXref(lex *Lexer, loadedEntries map[uint]struct{}) error {
+func (u *Updater) loadXrefEntry(
n uint, r ref, loadedEntries map[uint]struct{}) {
if _, ok := loadedEntries[n]; ok {
return
}
if lenXref := uint(len(u.xref)); n >= lenXref {
u.xref = append(u.xref, make([]ref, n-lenXref+1)...)
}
loadedEntries[n] = struct{}{}
u.xref[n] = r
}
func (u *Updater) loadXrefStream(
lex *Lexer, stack []Object, loadedEntries map[uint]struct{}) (
Object, error) {
var object Object
for {
var err error
if object, err = u.parse(lex, &stack); err != nil {
return New(End), fmt.Errorf("invalid xref table: %s", err)
} else if object.Kind == End {
return newError("invalid xref table")
}
// For the sake of simplicity, keep stacking until we find an object.
if object.Kind == Indirect {
break
}
stack = append(stack, object)
}
// ISO 32000-2:2020 7.5.8.2 Cross-reference stream dictionary
stream := object.Array[0]
if stream.Kind != Stream {
return newError("invalid xref table")
}
if typ, ok := stream.Dict["Type"]; !ok ||
typ.Kind != Name || typ.String != "XRef" {
return newError("invalid xref stream")
}
data, err := u.GetStreamData(stream)
if err != nil {
return New(End), fmt.Errorf("invalid xref stream: %s", err)
}
size, ok := stream.Dict["Size"]
if !ok || !size.IsUint() || size.Number <= 0 {
return newError("invalid or missing cross-reference stream Size")
}
type pair struct{ start, count uint }
pairs := []pair{}
if index, ok := stream.Dict["Index"]; !ok {
pairs = append(pairs, pair{0, uint(size.Number)})
} else {
if index.Kind != Array || len(index.Array)%2 != 0 {
return newError("invalid cross-reference stream Index")
}
a := index.Array
for i := 0; i < len(a); i += 2 {
if !a[i].IsUint() || !a[i+1].IsUint() {
return newError("invalid cross-reference stream Index")
}
pairs = append(pairs, pair{uint(a[i].Number), uint(a[i+1].Number)})
}
}
w, ok := stream.Dict["W"]
if !ok || w.Kind != Array || len(w.Array) != 3 ||
!w.Array[0].IsUint() || !w.Array[1].IsUint() || !w.Array[2].IsUint() {
return newError("invalid or missing cross-reference stream W")
}
w1 := uint(w.Array[0].Number)
w2 := uint(w.Array[1].Number)
w3 := uint(w.Array[2].Number)
if w2 == 0 {
return newError("invalid cross-reference stream W")
}
unit := w1 + w2 + w3
if uint(len(data))%unit != 0 {
return newError("invalid cross-reference stream length")
}
readField := func(data []byte, width uint) (uint, []byte) {
var n uint
for ; width != 0; width-- {
n = n<<8 | uint(data[0])
data = data[1:]
}
return n, data
}
// ISO 32000-2:2020 7.5.8.3 Cross-reference stream data
for _, pair := range pairs {
for i := uint(0); i < pair.count; i++ {
if uint(len(data)) < unit {
return newError("premature cross-reference stream EOF")
}
var f1, f2, f3 uint = 1, 0, 0
if w1 > 0 {
f1, data = readField(data, w1)
}
f2, data = readField(data, w2)
if w3 > 0 {
f3, data = readField(data, w3)
}
var r ref
switch f1 {
case 0:
r.offset = int64(f2)
r.generation = f3
case 1:
r.offset = int64(f2)
r.generation = f3
r.nonfree = true
case 2:
r.offset = int64(f3)
r.compressed = &f2
r.nonfree = true
default:
// TODO(p): It should be treated as a reference to
// the null object. We can't currently represent that.
return newError("unsupported cross-reference stream contents")
}
u.loadXrefEntry(pair.start+i, r, loadedEntries)
}
}
stream.Kind = Dict
stream.Stream = nil
return stream, nil
}
func (u *Updater) loadXref(lex *Lexer, loadedEntries map[uint]struct{}) (
Object, error) {
 	var throwawayStack []Object
-	if keyword, _ := u.parse(lex,
-		&throwawayStack); keyword.Kind != Keyword || keyword.String != "xref" {
-		return errors.New("invalid xref table")
+	if object, _ := u.parse(lex,
+		&throwawayStack); object.Kind != Keyword || object.String != "xref" {
+		return u.loadXrefStream(lex, []Object{object}, loadedEntries)
 	}
 	for {
 		object, _ := u.parse(lex, &throwawayStack)
 		if object.Kind == End {
-			return errors.New("unexpected EOF while looking for the trailer")
+			return newError("unexpected EOF while looking for the trailer")
 		}
 		if object.Kind == Keyword && object.String == "trailer" {
 			break
@@ -627,7 +841,7 @@ func (u *Updater) loadXref(lex *Lexer, loadedEntries map[uint]struct{}) error {
 		second, _ := u.parse(lex, &throwawayStack)
 		if !object.IsUint() || !second.IsUint() {
-			return errors.New("invalid xref section header")
+			return newError("invalid xref section header")
 		}
 		start, count := uint(object.Number), uint(second.Number)
@@ -639,33 +853,29 @@ func (u *Updater) loadXref(lex *Lexer, loadedEntries map[uint]struct{}) error {
 				off.Number > float64(len(u.Document)) ||
 				!gen.IsInteger() || gen.Number < 0 || gen.Number > 65535 ||
 				key.Kind != Keyword {
-				return errors.New("invalid xref entry")
+				return newError("invalid xref entry")
 			}
 			free := true
 			if key.String == "n" {
 				free = false
 			} else if key.String != "f" {
-				return errors.New("invalid xref entry")
+				return newError("invalid xref entry")
 			}
-			n := start + i
-			if _, ok := loadedEntries[n]; ok {
-				continue
-			}
-			if lenXref := uint(len(u.xref)); n >= lenXref {
-				u.xref = append(u.xref, make([]ref, n-lenXref+1)...)
-			}
-			loadedEntries[n] = struct{}{}
-			u.xref[n] = ref{
+			u.loadXrefEntry(start+i, ref{
 				offset:     int64(off.Number),
 				generation: uint(gen.Number),
 				nonfree:    !free,
-			}
+			}, loadedEntries)
 		}
 	}
+
+	trailer, _ := u.parse(lex, &throwawayStack)
+	if trailer.Kind != Dict {
+		return newError("invalid trailer dictionary")
+	}
-	return nil
+	return trailer, nil
 }
 
 // -----------------------------------------------------------------------------
@@ -695,7 +905,6 @@ func NewUpdater(document []byte) (*Updater, error) {
 	loadedXrefs := make(map[int64]struct{})
 	loadedEntries := make(map[uint]struct{})
-	var throwawayStack []Object
 	for {
 		if _, ok := loadedXrefs[xrefOffset]; ok {
 			return nil, errors.New("circular xref offsets")
@@ -705,24 +914,26 @@ func NewUpdater(document []byte) (*Updater, error) {
 		}
 		lex := Lexer{u.Document[xrefOffset:]}
-		if err := u.loadXref(&lex, loadedEntries); err != nil {
+		trailer, err := u.loadXref(&lex, loadedEntries)
+		if err != nil {
 			return nil, err
 		}
-		trailer, _ := u.parse(&lex, &throwawayStack)
-		if trailer.Kind != Dict {
-			return nil, errors.New("invalid trailer dictionary")
-		}
 		if len(loadedXrefs) == 0 {
 			u.Trailer = trailer.Dict
 		}
 		loadedXrefs[xrefOffset] = struct{}{}
 
+		// TODO(p): Descend into XRefStm here first, if present,
+		// which is also a linked list.
+
+		// We allow for mixed cross-reference tables and streams
+		// within a single Prev list, although this should never occur.
 		prevOffset, ok := trailer.Dict["Prev"]
 		if !ok {
 			break
 		}
-		// FIXME: We don't check for size_t over or underflow.
+		// FIXME: Do not read offsets and sizes as floating point numbers.
 		if !prevOffset.IsInteger() {
 			return nil, errors.New("invalid Prev offset")
 		}
@@ -764,18 +975,115 @@ func (u *Updater) Version(root *Object) int {
 	return 0
 }
func (u *Updater) getFromObjStm(nObjStm, n uint) (Object, error) {
if nObjStm == n {
return newError("ObjStm recursion")
}
stream, err := u.Get(nObjStm, 0)
if err != nil {
return stream, err
}
if stream.Kind != Stream {
return newError("invalid ObjStm")
}
if typ, ok := stream.Dict["Type"]; !ok ||
typ.Kind != Name || typ.String != "ObjStm" {
return newError("invalid ObjStm")
}
data, err := u.GetStreamData(stream)
if err != nil {
return New(End), fmt.Errorf("invalid ObjStm: %s", err)
}
entryN, ok := stream.Dict["N"]
if !ok || !entryN.IsUint() || entryN.Number <= 0 {
return newError("invalid ObjStm N")
}
entryFirst, ok := stream.Dict["First"]
if !ok || !entryFirst.IsUint() || entryFirst.Number <= 0 {
return newError("invalid ObjStm First")
}
// NOTE: This means descending into that stream if n is not found here.
// It is meant to be an object reference.
if extends, ok := stream.Dict["Extends"]; ok && extends.Kind != Nil {
return newError("ObjStm extensions are unsupported")
}
count := uint(entryN.Number)
first := uint(entryFirst.Number)
if first > uint(len(data)) {
return newError("invalid ObjStm First")
}
lex1 := Lexer{data[:first]}
data = data[first:]
type pair struct{ n, offset uint }
pairs := []pair{}
for i := uint(0); i < count; i++ {
var throwawayStack []Object
objN, _ := u.parse(&lex1, &throwawayStack)
objOffset, _ := u.parse(&lex1, &throwawayStack)
if !objN.IsUint() || !objOffset.IsUint() {
return newError("invalid ObjStm pairs")
}
pairs = append(pairs, pair{uint(objN.Number), uint(objOffset.Number)})
}
for i, pair := range pairs {
if pair.offset > uint(len(data)) ||
i > 0 && pairs[i-1].offset >= pair.offset {
return newError("invalid ObjStm pairs")
}
}
for i, pair := range pairs {
if pair.n != n {
continue
}
if i+1 < len(pairs) {
data = data[pair.offset:pairs[i+1].offset]
} else {
data = data[pair.offset:]
}
lex2 := Lexer{data}
var stack []Object
for {
object, err := u.parse(&lex2, &stack)
if err != nil {
return object, err
} else if object.Kind == End {
break
} else {
stack = append(stack, object)
}
}
if len(stack) == 0 {
return newError("empty ObjStm object")
}
return stack[0], nil
}
return newError("object not found in ObjStm")
}
 // Get retrieves an object by its number and generation--may return
 // Nil or End with an error.
+//
+// TODO(p): We should fix all uses of this not to eat the error.
 func (u *Updater) Get(n, generation uint) (Object, error) {
 	if n >= u.xrefSize {
 		return New(Nil), nil
 	}
 
 	ref := u.xref[n]
-	if !ref.nonfree || ref.generation != generation ||
-		ref.offset >= int64(len(u.Document)) {
+	if !ref.nonfree || ref.generation != generation {
+		return New(Nil), nil
+	}
+	if ref.compressed != nil {
+		return u.getFromObjStm(*ref.compressed, n)
+	} else if ref.offset >= int64(len(u.Document)) {
 		return New(Nil), nil
 	}
@@ -796,6 +1104,14 @@ func (u *Updater) Get(n, generation uint) (Object, error) {
 	}
 }
// Derefence dereferences Reference objects, and passes the other kinds through.
func (u *Updater) Dereference(o Object) (Object, error) {
if o.Kind != Reference {
return o, nil
}
return u.Get(o.N, o.Generation)
}
 // Allocate allocates a new object number.
 func (u *Updater) Allocate() uint {
 	n := u.xrefSize
@@ -822,8 +1138,8 @@ type BytesWriter interface {
 	WriteString(s string) (n int, err error)
 }
 
-// Update appends an updated object to the end of the document. The fill
-// callback must write exactly one PDF object.
+// Update appends an updated object to the end of the document.
+// The fill callback must write exactly one PDF object.
 func (u *Updater) Update(n uint, fill func(buf BytesWriter)) {
 	oldRef := u.xref[n]
 	u.updated[n] = struct{}{}
@@ -843,20 +1159,62 @@ func (u *Updater) Update(n uint, fill func(buf BytesWriter)) {
 	u.Document = buf.Bytes()
 }
-// FlushUpdates writes an updated cross-reference table and trailer.
-func (u *Updater) FlushUpdates() {
-	updated := make([]uint, 0, len(u.updated))
-	for n := range u.updated {
-		updated = append(updated, n)
-	}
-	sort.Slice(updated, func(i, j int) bool {
-		return updated[i] < updated[j]
-	})
-
-	buf := bytes.NewBuffer(u.Document)
-	startXref := buf.Len() + 1
-	buf.WriteString("\nxref\n")
+func (u *Updater) flushXRefStm(updated []uint, buf *bytes.Buffer) {
+	// The cross-reference stream has to point to itself.
+	// XXX: We only duplicate Update code here due to how we currently buffer.
+	n := u.Allocate()
+	updated = append(updated, n)
+	u.updated[n] = struct{}{}
+	u.xref[n] = ref{
+		offset:     int64(buf.Len() + 1),
+		generation: 0,
+		nonfree:    true,
+	}
+
+	index, b := []Object{}, []byte{}
+	write := func(f1 byte, f2, f3 uint64) {
+		b = append(b, f1)
+		b = binary.BigEndian.AppendUint64(b, f2)
+		b = binary.BigEndian.AppendUint64(b, f3)
+	}
+	for i := 0; i < len(updated); {
+		start, stop := updated[i], updated[i]+1
+		for i++; i < len(updated) && updated[i] == stop; i++ {
+			stop++
+		}
+
+		index = append(index,
+			NewNumeric(float64(start)), NewNumeric(float64(stop-start)))
+		for ; start < stop; start++ {
+			ref := u.xref[start]
+			if ref.compressed != nil {
+				write(2, uint64(*ref.compressed), uint64(ref.offset))
+			} else if ref.nonfree {
+				write(1, uint64(ref.offset), uint64(ref.generation))
+			} else {
+				write(0, uint64(ref.offset), uint64(ref.generation))
+			}
+		}
+	}
+
+	u.Trailer["Size"] = NewNumeric(float64(u.xrefSize))
+	u.Trailer["Index"] = NewArray(index)
+	u.Trailer["W"] = NewArray([]Object{
+		NewNumeric(1), NewNumeric(8), NewNumeric(8),
+	})
+	for _, key := range []string{
+		"Filter", "DecodeParms", "F", "FFilter", "FDecodeParms", "DL"} {
+		delete(u.Trailer, key)
+	}
+
+	stream := NewStream(u.Trailer, b)
+	fmt.Fprintf(buf, "\n%d 0 obj\n%s\nendobj", n, stream.Serialize())
+}
+
+func (u *Updater) flushXRefTable(updated []uint, buf *bytes.Buffer) {
+	buf.WriteString("\nxref\n")
 	for i := 0; i < len(updated); {
 		start, stop := updated[i], updated[i]+1
 		for i++; i < len(updated) && updated[i] == stop; i++ {
@@ -865,8 +1223,9 @@ func (u *Updater) FlushUpdates() {
fmt.Fprintf(buf, "%d %d\n", start, stop-start) fmt.Fprintf(buf, "%d %d\n", start, stop-start)
for ; start < stop; start++ { for ; start < stop; start++ {
// XXX: We should warn about any object streams here.
ref := u.xref[start] ref := u.xref[start]
if ref.nonfree { if ref.nonfree && ref.compressed == nil {
fmt.Fprintf(buf, "%010d %05d n \n", ref.offset, ref.generation) fmt.Fprintf(buf, "%010d %05d n \n", ref.offset, ref.generation)
} else { } else {
fmt.Fprintf(buf, "%010d %05d f \n", ref.offset, ref.generation) fmt.Fprintf(buf, "%010d %05d f \n", ref.offset, ref.generation)
@@ -883,10 +1242,38 @@ func (u *Updater) FlushUpdates() {
u.Trailer["Size"] = NewNumeric(float64(u.xrefSize)) u.Trailer["Size"] = NewNumeric(float64(u.xrefSize))
trailer := NewDict(u.Trailer) trailer := NewDict(u.Trailer)
fmt.Fprintf(buf, "trailer\n%s", trailer.Serialize())
}
fmt.Fprintf(buf, "trailer\n%s\nstartxref\n%d\n%%%%EOF\n", // FlushUpdates writes an updated cross-reference table and trailer, or stream.
trailer.Serialize(), startXref) func (u *Updater) FlushUpdates() {
updated := make([]uint, 0, len(u.updated))
for n := range u.updated {
updated = append(updated, n)
}
sort.Slice(updated, func(i, j int) bool {
return updated[i] < updated[j]
})
// It does not seem to be possible to upgrade a PDF file
// from trailer dictionaries to cross-reference streams,
// so keep continuity either way.
//
// (Downgrading from cross-reference streams using XRefStm would not
// create a true hybrid-reference file, although it should work.)
buf := bytes.NewBuffer(u.Document)
startXref := buf.Len() + 1 /* '\n' */
if typ, _ := u.Trailer["Type"]; typ.Kind == Name && typ.String == "XRef" {
u.flushXRefStm(updated, buf)
} else {
u.flushXRefTable(updated, buf)
}
fmt.Fprintf(buf, "\nstartxref\n%d\n%%%%EOF\n", startXref)
u.Document = buf.Bytes() u.Document = buf.Bytes()
u.updated = make(map[uint]struct{})
u.Trailer["Prev"] = NewNumeric(float64(startXref))
} }
// ----------------------------------------------------------------------------- // -----------------------------------------------------------------------------
@@ -904,17 +1291,47 @@ func NewDate(ts time.Time) Object {
 	return NewString(string(buf))
 }
// GetStreamData returns the actual data stored in a stream object,
// applying any filters.
func (u *Updater) GetStreamData(stream Object) ([]byte, error) {
if f, ok := stream.Dict["F"]; ok && f.Kind != Nil {
return nil, errors.New("stream data in other files are unsupported")
}
// Support just enough to decode a common cross-reference stream.
if filter, ok := stream.Dict["Filter"]; !ok {
return stream.Stream, nil
} else if filter.Kind != Name || filter.String != "FlateDecode" {
return nil, errors.New("unsupported stream Filter")
}
// TODO(p): Support << /Columns N /Predictor 12 >>
// which usually appears in files with cross-reference streams.
if parms, ok := stream.Dict["DecodeParms"]; ok && parms.Kind != Nil {
return nil, errors.New("DecodeParms are not supported")
}
r, err := zlib.NewReader(bytes.NewReader(stream.Stream))
if err != nil {
return nil, err
}
var b bytes.Buffer
_, err = b.ReadFrom(r)
return b.Bytes(), err
}
 // GetFirstPage retrieves the first page of the given page (sub)tree reference,
 // or returns a Nil object if unsuccessful.
-func (u *Updater) GetFirstPage(nodeN, nodeGeneration uint) Object {
-	obj, _ := u.Get(nodeN, nodeGeneration)
-	if obj.Kind != Dict {
+func (u *Updater) GetFirstPage(node Object) Object {
+	obj, err := u.Dereference(node)
+	if err != nil || obj.Kind != Dict {
 		return New(Nil)
 	}
 
 	// Out of convenience; these aren't filled normally.
-	obj.N = nodeN
-	obj.Generation = nodeGeneration
+	obj.N = node.N
+	obj.Generation = node.Generation
 
 	if typ, ok := obj.Dict["Type"]; !ok || typ.Kind != Name {
 		return New(Nil)
@@ -934,7 +1351,7 @@ func (u *Updater) GetFirstPage(nodeN, nodeGeneration uint) Object {
 	}
 
 	// XXX: Nothing prevents us from recursing in an evil circular graph.
-	return u.GetFirstPage(kids.Array[0].N, kids.Array[0].Generation)
+	return u.GetFirstPage(kids.Array[0])
 }
 
 // -----------------------------------------------------------------------------
@@ -1128,7 +1545,10 @@ func Sign(document []byte, key crypto.PrivateKey, certs []*x509.Certificate,
 	if !ok || rootRef.Kind != Reference {
 		return nil, errors.New("trailer does not contain a reference to Root")
 	}
-	root, _ := pdf.Get(rootRef.N, rootRef.Generation)
+	root, err := pdf.Dereference(rootRef)
+	if err != nil {
+		return nil, fmt.Errorf("Root dictionary retrieval failed: %s", err)
+	}
 	if root.Kind != Dict {
 		return nil, errors.New("invalid Root dictionary reference")
 	}
@@ -1182,7 +1602,7 @@ func Sign(document []byte, key crypto.PrivateKey, certs []*x509.Certificate,
 	if !ok || pagesRef.Kind != Reference {
 		return nil, errors.New("invalid Pages reference")
 	}
-	page := pdf.GetFirstPage(pagesRef.N, pagesRef.Generation)
+	page := pdf.GetFirstPage(pagesRef)
 	if page.Kind != Dict {
 		return nil, errors.New("invalid or unsupported page tree")
 	}
@@ -1204,7 +1624,7 @@ func Sign(document []byte, key crypto.PrivateKey, certs []*x509.Certificate,
 	})
 
 	// 8.6.1 Interactive Form Dictionary
-	if _, ok := root.Dict["AcroForm"]; ok {
+	if acroform, ok := root.Dict["AcroForm"]; ok && acroform.Kind != Nil {
 		return nil, errors.New("the document already contains forms, " +
 			"they would be overwritten")
 	}

test.sh (23 lines changed)

@@ -11,11 +11,15 @@ mkdir tmp
 # Create documents in various tools
 log "Creating source documents"
-inkscape --pipe --export-filename=tmp/cairo.pdf <<'EOF' 2>/dev/null || :
+inkscape --pipe --export-filename=tmp/cairo.pdf --export-pdf-version=1.4 \
+	<<'EOF' 2>/dev/null || :
 <svg xmlns="http://www.w3.org/2000/svg"><text x="5" y="10">Hello</text></svg>
 EOF
 
-date | tee tmp/lowriter.txt | groff -T pdf > tmp/groff.pdf || :
+date > tmp/lowriter.txt
+if command -v gropdf >/dev/null
+then groff -T pdf < tmp/lowriter.txt > tmp/groff.pdf
+fi
 lowriter --convert-to pdf tmp/lowriter.txt --outdir tmp >/dev/null || :
 convert rose: tmp/imagemagick.pdf || :
@@ -45,7 +49,11 @@ openssl x509 -req -in tmp/cert.csr -out tmp/cert.pem \
 	-CA tmp/ca.cert.pem -CAkey tmp/ca.key.pem -set_serial 1 \
 	-extensions smime -extfile tmp/cert.cfg 2>/dev/null
 openssl verify -CAfile tmp/ca.cert.pem tmp/cert.pem >/dev/null
+
+# The second line accomodates the Go signer,
+# which doesn't support SHA-256 within pkcs12 handling
 openssl pkcs12 -inkey tmp/key.pem -in tmp/cert.pem \
+	-certpbe PBE-SHA1-3DES -keypbe PBE-SHA1-3DES -macalg sha1 \
 	-export -passout pass: -out tmp/key-pair.p12
 
 for tool in "$@"; do
@@ -55,6 +63,12 @@ for tool in "$@"; do
 		result=${source%.pdf}.signed.pdf
 		$tool "$source" "$result" tmp/key-pair.p12 ""
 		pdfsig -nssdir sql:tmp/nssdir "$result" | grep Validation
+
+		# Only some of our generators use PDF versions higher than 1.5
+		log "Testing $tool for version detection"
+		grep -q "/Version /1[.]6" "$result" \
+			|| grep -q "^%PDF-1[.][67]" "$result" \
+			|| die "Version detection seems to misbehave (no upgrade)"
 	done
 
 	log "Testing $tool for expected failures"
@@ -63,11 +77,6 @@ for tool in "$@"; do
 	$tool -r 1 "$source" "$source.fail.pdf" tmp/key-pair.p12 "" \
 		&& die "Too low reservations shouldn't succeed"
 
-	# Our generators do not use PDF versions higher than 1.5
-	log "Testing $tool for version detection"
-	grep -q "/Version /1.6" "$result" \
-		|| die "Version detection seems to misbehave (no upgrade)"
-
 	sed '1s/%PDF-1../%PDF-1.7/' "$source" > "$source.alt"
 	$tool "$source.alt" "$result.alt" tmp/key-pair.p12 ""
 	grep -q "/Version /1.6" "$result.alt" \