How to serve up print-documents on the web?

23 11 2009

In a quest for continuous improvement, we’re having a think about our study guides. Most authors have their documents online for students, be they PDFs (mostly) or Word docs. What the students do with those documents I’m not entirely sure, and as far as I know there are no stats or research that might answer that question.

Even the stats for the eStudyGuide pages (discussed in an earlier post – taking the XML from InDesign to make a web front end for the study guides) don’t indicate any trend; they only show that people are looking at the front end – from where, for how long and so on. Actually, there’s a possible answer there: I wonder if there are any server stats on the PDFs. Would the stats show how long a document stays open in the browser, or would they just record the hit? I’ll have to find out. There’s got to be some data we can look at to learn more about usage.
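If the server does keep access logs, even a raw hit count per PDF would be a start – a hit only says the file was requested, not how long it stayed open. Here’s a rough sketch of the idea in Python; the log format and paths are just assumptions for illustration, not our actual setup:

```python
import re
from collections import Counter

# Match the requested path of any .pdf in an Apache-style access log line.
LOG_LINE = re.compile(r'"GET (\S+\.pdf) HTTP')

def pdf_hits(log_lines):
    """Count hits per PDF. A hit is a request, nothing more."""
    counts = Counter()
    for line in log_lines:
        match = LOG_LINE.search(line)
        if match:
            counts[match.group(1)] += 1
    return counts

# Hypothetical log lines, just to show the shape of the data.
sample = [
    '1.2.3.4 - - [23/Nov/2009:10:00:00] "GET /ABC123/eStudyGuide/ABC123_1.pdf HTTP/1.1" 200 1024',
    '1.2.3.4 - - [23/Nov/2009:10:05:00] "GET /ABC123/eStudyGuide/ABC123_1.pdf HTTP/1.1" 200 1024',
]
print(pdf_hits(sample))
```

Not much, but it would at least separate “never opened” from “downloaded every week”.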

In the meantime, tell me what you would do with a PDF study guide if you were studying:

So, anyway, back to the point of this: what is really the best way to serve up the documents online? If students are indeed reading on screen, perhaps we should put some time into looking at a better way to present it all. Adobe Digital Editions, for instance, might be a winner. Like Acrobat Reader it’s a free program, but it’s for managing and viewing eBooks. Students would be able to manage and read all their study guides on their computer (making them more convenient, mobile, and so on), plus the EPUB format would be easy as to spit out of InDesign (which is what our study guides are created in).

Incrementing and logic in XSL

17 11 2009

I’ve been learning XSL for the design of the learning guide I mentioned in my last post, and I just had a bit of a win with incrementing in the XSL! I searched the net far and wide and didn’t find anything that suited what I wanted in terms of incrementing – most of what I did find didn’t even work anyway.

The XML I’m transforming describes the content in terms of “sections”. The XSL transforms those sections into a jQuery tabbed interface, so basically I need the title attributes of the tab links to match the id attributes of the div containers. I also need these to be unique for the jQuery tabs to work. Thus, I wanted to increment the values.

Here’s a sample of the XML – just a simple example. An item is a module in the learning guide; it contains various information, part of which is the actual ‘content’ for that module:

<itemContent>
	<section title="Start">
		<sectionContent>first section of this module</sectionContent>
	</section>
	<section title="Middle one">
		<sectionContent>content for middle section</sectionContent>
	</section>
	<section title="Finish">
		<sectionContent>content for third section here</sectionContent>
	</section>
</itemContent>

Then, the XSL that transforms and increments. I had it all around the wrong way initially – I was trying to use variables and addition, loops etc. to do my incrementing – but really, all I had to do was grab the node position and use it inside my for-each loop!

<div id="tabs">
	<xsl:for-each select="itemContent/section">
		<xsl:variable name="i" select="position()" />
		<a href="#" title="Section_{$i}"><xsl:value-of select="@title" /></a>
	</xsl:for-each>
	<div class="clear"></div>

	<xsl:for-each select="itemContent/section">
		<xsl:variable name="i" select="position()" />
		<div id="Section_{$i}" class="hiddencontent">
			<h3><xsl:value-of select="@title" /></h3>
			<xsl:copy-of select="sectionContent" />
		</div>
	</xsl:for-each>
</div>

It works a charm and it renders like this in the browser:

HTML render of the XSL incrementing

Note the incrementing and the matching title/id attributes.
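For anyone who doesn’t read XSL, here’s the same increment-by-position trick sketched in Python – the section titles are the ones from the sample XML, and the HTML shape is roughly what the jQuery tabs expect:

```python
# Number each section by its position so the tab link's title attribute
# matches the content div's id attribute (Section_1, Section_2, ...).
def render_tabs(titles):
    links = ''.join(
        f'<a href="#" title="Section_{i}">{t}</a>'
        for i, t in enumerate(titles, start=1)
    )
    divs = ''.join(
        f'<div id="Section_{i}" class="hiddencontent"><h3>{t}</h3></div>'
        for i, t in enumerate(titles, start=1)
    )
    return f'<div id="tabs">{links}<div class="clear"></div>{divs}</div>'

print(render_tabs(['Start', 'Middle one', 'Finish']))
```

The point is the same as position() in the XSL: the loop index itself is the unique, matching identifier – no counters to maintain.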

Media-rich Learning Guide

6 11 2009

We had this little problem last year – we really wanted SOMETHING to do an online study/learning guide with. Well, not just ‘something’: we wanted it to be media rich, not just visually appealing but with all sorts of GUI features (like popups, tabs etc.) so it could carry more media-rich content.

The reason at the time was that the course we were working on had SO many resources, from all over the place, and we thought it might be a good thing to bring everything together, into context (and to make it look nice).

I started thinking of ideas for a GUI and realised one of my favourite interfaces would have to be iTunes!  The interface is called cover flow and it’s visually delicious to say the least.

iTunes coverflow

This is the iTunes coverflow GUI, nice hey?

As it turned out, after some searching of the net for Flash resources, I found an awesome open-source Flash-based cover flow project.  The beauty of it, like the beauty of iTunes, is that it basically gets all its data from an XML file.  So that got me thinking.

What if the learning guide was entirely in XML?  Well, why not.  You could request a course code in your web browser, a PHP page could pass your request to Flash, which then loads the XML learning guide menu in cover flow, and voila – ajax the content into the page with each click in the Flash cover flow!

I modified the Flash-based cover flow to suit our purpose.  I made the initial course trial static, in that the content was being loaded from HTML files and the course-specific XML only contained details on the course topics, titles etc.  Now, however, I’m working on the full deal: the entire learning guide in XML.  It’s all coming together rather nicely.


We've nicknamed it the HLG (hypertextual learning guide). You select your study module from the cover flow menu, and all the content loads below via ajax. The content sits in the course XML, and the XSL handles the rendering of all the variations in that content. This example has some readings, but there could also be videos, podcasts, all sorts of stuff.

Learning XSL has been a highlight of the last week. I’m still quite a noob at it, but so far I’ve done some awesome things – I can handle all sorts of variations in the XML (which means variations in content) and render them on the page nicely. Videos come up with a video icon and use the jQuery prettyBox plugin, documents come up with a document icon, and abstracts are viewable via a jQuery clueTip plugin. It’s all coming together.

The setup

I have some more work to do on the XML/XSL, but I think it will all go well. At the end of it, I'll basically have a single PHP file that takes course code requests. The page will talk to a library of standard resources – the images in the cover flow menu, the background images, CSS and JavaScript – then talk to the requested XML (via a PHP transform and an XSL file) to get all the details and content.

Phase 3 will be building an administration interface to manage the XML.. oh, fun.

Web front end for InDesign docs

5 11 2009

Well well.. I’ve been out of touch with the InDesign stuff I was really getting into last year – other projects, as well as the restructure and the fallout from that.  I did, however, pick things up a little recently, and finished a first draft of a front end to display the Study Guides from InDesign.

As you may or may not know, part of our InDesign process for the study guides lets us export XML from the Table Of Contents (TOC).  This is a great spot to get some really usable data about the document – for instance, how many chapters there are and what the chapter titles are!

After creating the PDFs and exporting the XML from the TOC, we simply dump it all on a web server.  I made a nice little PHP page that takes requests and displays an interface for students to download those PDFs – the PHP reads the requested course’s XML file and creates the list with the chapter titles and so on.

Why’d I bother doing this?  A couple of reasons really… well, okay, a few:

  • It’s fun.
  • It saves academics time by setting up a page for them where students can download the chapterised PDFs for their Study Guide.
  • It also saves academics the time of splitting their whole-document PDFs into chapter PDFs, because we do it.

It works well for us, really – all we do is export the goods and throw it on a server; it couldn’t be simpler.  The outcome is really quite nice.  The next challenge will be how to integrate this more with the new LMS, Moodle.

eStudyGuide page

This is the end result, the web front end for the InDesign XML & PDFs

In terms of the web front end, it was quite easy to read the XML with PHP.  I didn’t need to do anything fancy at all – I suppose it is pretty simple XML.  In fact, I used PHP’s SimpleXML module.

By passing the courseID as a variable in the URL, the page loads the requested XML file:

//capture courseID from URL (basename() keeps the request from wandering up the path)
$courseID = basename($_GET['courseID']);

//load the xml that contains the info on the documents
$pathExt = ".xml";
$docInfoPath = $courseID . "/eStudyGuide/" . $courseID . $pathExt;
$docInfoXML = simplexml_load_file($docInfoPath);
$docTitles = array();

Then looping through to display the docs was as easy as this:

$i = 1;

foreach ($docInfoXML->TOClev1 as $docTitles) {
	if (strlen($docTitles) != 3) {
		echo "<li><a href='" . $courseID . "/eStudyGuide/" . $courseID . "_" . $i . ".pdf'>"
			. str_replace('    ', ":  pg", $docTitles) . "</a></li>";
	}
	$i++;
}
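For the curious, here’s the same parse-and-loop idea sketched with Python’s standard library – the TOC XML below is a simplified stand-in for what InDesign exports, not the real thing:

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified stand-in for the XML exported from the TOC.
# The four spaces between title and page number mimic the TOC formatting.
TOC_XML = """
<Root>
    <TOClev1>Chapter 1    12</TOClev1>
    <TOClev1>Chapter 2    34</TOClev1>
</Root>
"""

def chapter_links(course_id, xml_text):
    """Build one download link per TOC entry, numbered to match the chapter PDFs."""
    root = ET.fromstring(xml_text)
    links = []
    for i, entry in enumerate(root.iter('TOClev1'), start=1):
        text = entry.text.replace('    ', ':  pg')
        links.append(
            f"<li><a href='{course_id}/eStudyGuide/{course_id}_{i}.pdf'>{text}</a></li>"
        )
    return links

print(chapter_links('ABC123', TOC_XML))
```

Same shape as the PHP: the loop index is all that ties a TOC entry to its chapter PDF, which is why the PDFs have to be numbered consistently on export.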

Looking at InCopy for our Study Guides

16 12 2008

Given the current situation at the University, I want to see how we can continue to improve the process we started with our Study Guides (earlier posts explain this process, including InDesign and its XML features).

I thought I might as well look at InCopy. So I’ve looked at it, and I’ve been thinking. It seems like a good tool for starters – I like the whole assignment-based workflow idea, where multiple authors can work on pieces of a document simultaneously. However, at this time I don’t see the value in adding InCopy to the study guide process, and I’m going to abort my testing of it. Unless we significantly change things, it’s not going to improve our workloads in any great way as far as I can see.

To get big gains on workflow we’d need to use InCopy completely instead of MS Word, and thus we would need to either:

ONE: get our real authors to use InCopy. Our lecturers would then be more directly involved in creating their study material, but the downside is I’m sure they wouldn’t have the time..? We would then take on the designer role and manage the content and assignments – just like a real publishing house. OR;

TWO: get InDesign Server and develop a front end that lets authors edit their content without needing InCopy. This would take time, money, more money and I don’t know what else! It’d be great and extensible, but we’d need to be careful to ensure it met our real needs and improved our processes without getting carried away.

So, where to next? Who knows.

This is what I’d do if I had unlimited skills, time, and money from the University:

Look at MS Word’s XML-based docx format and investigate how we can effectively (automatically) transport that structured XML content into InDesign – or, more generally, into a more usable XML format than straight docx.

It’d be great to have a front end where the content could be checked out for any needed updates – checked out by authors or designers – and then, at a certain date (or manually), the server does all the work: TOCs, tagging, footers, PDFs for print and online, XML exports, web pages to display the content as well, the whole lot.

Hrm, wouldn’t that be nice.

Thoughts on the 2008 Ascilite conference so far

1 12 2008


Okay, the first thing I’ve learned is that it would have been a good idea to borrow a laptop before coming.  It’d make note-taking much easier, among other things!

The first day of the conference went well, consisting of a half-day workshop on educating the NET generation.  It was a non-stop discussion, and it was fascinating to hear what people think about the so-called NET generation.. I’m not convinced, though.

Obviously trends show that social networking and technological tools in general are becoming increasingly ‘natural’ to a lot of people.  I find that these people aren’t narrowed down to an age group or a certain demographic, however, as the whole idea of the ‘net generation’ seems to suggest. Millions of things contribute to how one understands and uses technology in everyday life. I always grew up with technology – my parents gave me access to it and used it themselves – so in one way this would have shaped my natural uptake and use of these tools.

After today I had the feeling that a lot of people tend to talk about web2 tools as something for students to use, something to base objectives or actual activities around. I’d personally like to see more staff USE these technologies to TEACH – use them to teach and see where that goes.  VoiceThread, for example, would be an awesome mode of delivery for online lectures or even tutorials, no matter what discipline.  Students could also co-contribute and everyone would be hunky-dory… however you spell that.

Anyway, before I ramble on too much: the discussion showed some really interesting student trends, based on studies from 2006 – something to bring home and have a look at.  Overall the workshop was good, and fun.

Day 1:

So far today has been.. interesting.  From the workshops so far I’ve seen some web2 tools I’d never heard of, so I’ll be sure to keep a list of them and suss them out when I return to work.  One in particular is VEX, a tool that combines various web2 features – blogs and more – in one spot.  I like the idea of a single tool that does many things much more than half a dozen tools that do separate things.

The whole notion of students as co-contributors of content and thus their own learning is popping up in most sessions.  Again I think of VoiceThread having great potential but I’m sure there’s so many others that I don’t know of yet.

In talk #2, session 1, we heard from a University that had been trialling a trigger-enabled SMS system, which was also pretty interesting.  Students opt in to the service, and once that’s done they can easily request assessment results, exam times and dates, dates of upcoming assignments, and much more.  It wasn’t just a request-based system, either: students also got personalised reminders close to due dates and so forth.  Overall, feedback from student surveys indicated it was more than moderately successful.  I personally think it’s a pretty good idea – perhaps not in terms of ‘learning and teaching’ as such, but for basic student convenience and motivation.

Talk #3 was a unique presentation; rather than presenting strict evidence or theory, it gave us an insight into what students think about using mobile web2 technologies.  Students in a Product Design course (and others) were supplied with equipment such as iPhones by the University, and the clips we saw really showed how enjoyable this was for them – no doubt it contributed to their overall course motivation.  Who wouldn’t be happy being given an iPhone to play with for a semester!

Summary of Day 1:

I’m finding that the talks are very short and rushed, which is a shame, because a lot of these presenters have so much more to offer.

From what I’ve seen so far, we are MORE than keeping up with most other Universities in terms of thinking about how to incorporate web2 tools TO enhance learning AND teaching.  We’re doing a good job, we just need more people power!!

Over and out for today.

Style maps behaving strangely in InDesign CS4

6 11 2008

How weird – the style mapping for importing Word docs into InDesign CS4 seems to be really flaky.

I’ve tried a lot of things.  Originally I just plonked my old SMP files into CS4’s Word import presets directory – that didn’t work.  It picked up the SMP file upon import, but when I viewed the style maps all the mapping info was simply not there.  Needless to say, the mapping didn’t work in the slightest after placing the Word doc.

So, attempt two: I manually re-created the style maps, saved a new preset file, and placed my Word doc.  No go again – it completely ignored what I’d told it to do.

So I wondered, ‘maybe there are some style map conflicts’.  I went to place the doc again and saw there were only minimal conflicts – say 3 – when about 15 styles didn’t map, so that didn’t add up.

I checked the presets I’d created in attempt 2, and just like attempt one, everything I’d entered was gone.  Weird.

So I re-created them again – attempt number 3.  I then looked at the SMP files in Dreamweaver, since they’re basically just XML.  The XML had all the info in it, but the maps still didn’t work, just as in attempts 1 and 2.  I decided to replace the XML with my old XML – that didn’t work either.

I dunno what’s going on with it, but it’s got issues.. until I get to the bottom of it I’m very unimpressed.

Cell padding in InDesign

13 08 2008

We had a fair few hiccups in our overall Word to InDesign conversion recently (see previous posts regarding study guides and InDesign).  One of the problems involved table and cell styles in InDesign not applying the specified cell padding!

This was really frustrating: even though you’d applied the table/cell styles to the tables in the document, you still had to add the cell padding manually.  And yes, cell padding is entered in the cell style specs, so there’s no reason I can think of why it’d be ignored.  It’s possible the settings from Word were overriding the cell specs.. that’s probably all it was, actually.

Anyway, a solution was found by trawling through the Adobe InDesign Exchange for third-party scripts.  I ended up finding ‘TableStyle’, which was fantastic.  It lets you specify a whole range of table formatting, and it does a selection or an entire document automatically.  All I had to do was copy the script into InDesign’s scripts panel directory and voila – double-click the script in the automation panel in InDesign and the whole document is done.

The only tweaking I did was to edit out the code that updated table borders and colours, since they interfered with our existing table styles.  That was only a quick one-minute JavaScript edit though.  Easy!

Overall I’m impressed by the scripting capabilities of InDesign, I’ll be looking in to what else we can do in the future to get around various problems.

Import XML into InDesign CS3

9 05 2008

The scenario:

We have 100-odd study guides, each of which requires the creation of an overprint PDF containing course and faculty information, as well as a barcode and item number. The overprints are printed onto an already produced cover. It saves time and money, and makes sense for us really.

The problem with that is that no one really has time to manually create 100 pages in InDesign and copy and paste course and barcode information from a number of sources – yuk! Even if we did have time, I don’t like the chances of stuffing up one of the barcodes, since you can’t even read what you’ve typed or pasted in.

The solution:

Pull the course, faculty and barcode information from a Database; create a simple XML file, and import into InDesign!

How the process went:

From what I’d read on the net, I assumed that when one imports XML into InDesign, all your pages are automatically created.  Maybe I missed something, but I found that I had to create the pages manually – which isn’t really a problem, as it’s only a few mouse clicks.

The next subtle thing I found, which seemed to be missing from all the instructions and tips I’d read on the net, was that the pages didn’t actually take on the tagged structure of their master page!  Okay, that was easy: all I had to do was tell the pages to override their master, and once I’d done that, DELETE the content of the master page (or else the imported XML goes into the master page as well as your normal pages).

Despite everything looking perfect, the XML wouldn’t import correctly: I ended up with some data missing on one page and turning up on the next – basically stuff was all over the place.

I found out, after going around in circles for ages, that when I’d told my pages to override the master page, they didn’t take on the XML structure exactly as it was on the master!!!!  They all had their tags in the wrong bloody order!

So then all I did was adjust the order of the tags in my XML source – it still made sense, so that was fine with me.

Once I’d done that.. it worked perfectly!  XML into InDesign does work, it’s just a little bit annoying sometimes!


Problems with Second Life recording

21 04 2008

We’re nearly finished the Machinima we’re making in Second Life – today we were supposed to record the final scenes, but we hit a handful of problems.

The first problem was that our recordings played back at around twice the speed they should have! So you can imagine what that looked like – absurd!! We tried a number of things, none of which seemed to help:

  • Tried several different recording codecs
  • Tried varying all of the recording options – key frames etc.
  • Tried adjusting the visual settings (reducing the graphics quality in case the frames weren’t syncing properly)
  • Tried restarting the client, then restarting the computer

The scene involved several people having afternoon drinks; each avatar had a wine glass with a drinking animation. We thought perhaps all the avatar animation was playing a part, so we took the wine glasses away, and that seemed to fix the problem. As a compromise we’ve left 2 of the 7 avatars with glasses, and we’ve put in some little tables to sit static glasses on.

So it seems that all the avatar animation was causing the client to lose frames when recording (it couldn’t keep up), so when we played the video back it was way too fast.

The second problem, which we haven’t managed to solve completely, is that the light on the avatars flickers between brighter and darker as the camera moves in and away from them. While not major, this is really annoying, and it looks very dodgy at certain times of day/night. We’ve experimented with putting some lights closer to all the avatars and making sure the fall-off distance isn’t less than the camera distance – this seemed to help a little when coupled with fiddling with the environment time. The flicker still happens, but it’s not as noticeable now.

So, after a dismal day, we’ll continue in the morning and hopefully have these problems ironed out.