Compare commits

...

8 commits

Author SHA1 Message Date
59da1fa04d Add more importing to cron
We're gonna try saving a neologin cookie in the environment variables, and see how long-lived it is.
2025-11-02 06:00:50 +00:00
3dca3fe05a Add logging for alt style changes
There's something fishy going on with alt style IDs perhaps being reused? I want logs to be able to potentially track this down later on…
2025-11-02 04:18:33 +00:00
ec8d0fdbdc Set up deployment inside devcontainer 2025-11-02 04:02:06 +00:00
3d86231e29 Oops, re-enable setting as default upon deploy 2025-10-31 03:34:02 +00:00
3582229b47 Update NC Mall scraping for new redesign
First actual feature I'm letting Claude run! We worked through exploring the updated API together, then it ran with the implementation.

I left this hanging for a long time… good to finally have it updated!
2025-10-30 12:43:14 +00:00
b1f06029f8 Modernize RocketAMF C types to fix build error
I'm not sure if this is a Mac-only problem or what, but we were getting incompatible-function-pointer errors when trying to build the RocketAMF C extensions. This fixes that! (Maybe it's like, Mac-only but as of Ruby 3.4 in specific? We're running RocketAMF in production on Ruby 3.4 right now without this. Shrug.)
2025-10-30 02:45:56 +00:00
d90e0549ca Update devcontainer
The Ruby version got out of date at some point… here, I use `bin/rails devcontainer` as the newer, simpler base.
2025-10-30 02:16:54 +00:00
d72d358135 Add high-level documentation
I'm starting to learn how AI agent stuff works, and a lot of what I'm finding is that rushing them into feature development sets you up for disaster, but that having strong collaboration conversations with helpful context works wonders.

So, I'm starting by creating that context: I had a little "here's the codebase" walkthrough conversation with Claude Code, and it generated these docs as output—which came out solid from the jump, with a few tweaks from me for improved nuance.

My hope is that this can serve both as an improved starting point for human collaborators _and_ if I let future Claude instances play around in here. That's a big theme of what I've found with AI tools so far: don't try to get clever, don't expect the world, just give them the same support you'd give people—and then everybody wins 🤞
2025-10-30 07:31:36 +11:00
138 changed files with 7881 additions and 389 deletions


@@ -1,15 +1,3 @@
FROM mcr.microsoft.com/devcontainers/ruby:1-3.1-bullseye
# Default value to allow debug server to serve content over GitHub Codespace's port forwarding service
# The value is a comma-separated list of allowed domains
ENV RAILS_DEVELOPMENT_HOSTS=".githubpreview.dev,.preview.app.github.dev,.app.github.dev"
# [Optional] Uncomment this section to install additional OS packages.
# RUN apt-get update && export DEBIAN_FRONTEND=noninteractive \
# && apt-get -y install --no-install-recommends <your-package-list-here>
# [Optional] Uncomment this line to install additional gems.
# RUN gem install <your-gem-names-here>
# [Optional] Uncomment this line to install global node packages.
# RUN su vscode -c "source /usr/local/share/nvm/nvm.sh && npm install -g <your-package-here>" 2>&1
# Make sure RUBY_VERSION matches the Ruby version in .ruby-version
ARG RUBY_VERSION=3.4.5
FROM ghcr.io/rails/devcontainer/images/ruby:$RUBY_VERSION


@@ -1,29 +1,34 @@
version: '3'
name: "openneo_impress_items"
services:
app:
rails-app:
build:
context: ..
dockerfile: .devcontainer/Dockerfile
volumes:
- ../..:/workspaces:cached
- ../..:/workspaces:cached
# Overrides default command so things don't shut down after the process ends.
command: sleep infinity
# Runs app on the same network as the database container, allows "forwardPorts" in devcontainer.json function.
network_mode: service:db
# Uncomment the next line to use a non-root user for all processes.
# user: vscode
# Use "forwardPorts" in **devcontainer.json** to forward an app port locally.
# (Adding the "ports" property to this file will not forward from a Codespace.)
depends_on:
- mysql
db:
image: mysql:latest
mysql:
image: mariadb:10.6
restart: unless-stopped
volumes:
- ./create-db.sql:/docker-entrypoint-initdb.d/create-db.sql
environment:
MYSQL_ROOT_PASSWORD: impress_dev
MYSQL_USER: impress_dev
MYSQL_PASSWORD: impress_dev
MYSQL_ALLOW_EMPTY_PASSWORD: 'true'
volumes:
- mysql-data:/var/lib/mysql
networks:
- default
volumes:
mysql-data:


@@ -1,5 +0,0 @@
CREATE DATABASE openneo_impress;
GRANT ALL PRIVILEGES ON openneo_impress.* TO impress_dev;
CREATE DATABASE openneo_id;
GRANT ALL PRIVILEGES ON openneo_id.* TO impress_dev;


@@ -1,46 +1,36 @@
// For format details, see https://aka.ms/devcontainer.json. For config options, see the
// README at: https://github.com/devcontainers/templates/tree/main/src/ruby-rails-postgres
// For format details, see https://containers.dev/implementors/json_reference/.
// For config options, see the README at: https://github.com/devcontainers/templates/tree/main/src/ruby
{
"name": "Dress to Impress",
"dockerComposeFile": "docker-compose.yml",
"service": "app",
"name": "openneo_impress_items",
"dockerComposeFile": "compose.yaml",
"service": "rails-app",
"workspaceFolder": "/workspaces/${localWorkspaceFolderBasename}",
"features": {
"ghcr.io/devcontainers/features/node:1": {
"nodeGypDependencies": true,
"version": "lts"
}
},
// Features to add to the dev container. More info: https://containers.dev/features.
// "features": {},
// Use 'forwardPorts' to make a list of ports inside the container available locally.
// This can be used to network with other containers or the host.
"forwardPorts": [3000],
// Use 'postCreateCommand' to run commands after the container is created.
"postCreateCommand": ".devcontainer/post-create.sh",
"features": {
"ghcr.io/devcontainers/features/node:1": {},
"ghcr.io/rails/devcontainer/features/mysql-client": {},
"ghcr.io/devcontainers-extra/features/ansible:2": {}
},
"containerEnv": {
// Because the database is hosted on the local network at the hostname `db`,
// we partially override `config/database.yml` to connect to `db`!
"DATABASE_URL_PRIMARY_DEV": "mysql2://db",
"DATABASE_URL_OPENNEO_ID_DEV": "mysql2://db",
"DATABASE_URL_PRIMARY_TEST": "mysql2://db",
"DATABASE_URL_OPENNEO_ID_TEST": "mysql2://db",
"DB_HOST": "mysql"
},
// HACK: Out of the box, this dev container doesn't allow installation to
// the default GEM_HOME, because of a weird thing going on with RVM.
// Instead, we set a custom GEM_HOME and GEM_PATH in our home directory!
// https://github.com/devcontainers/templates/issues/188
"GEM_HOME": "~/.rubygems",
"GEM_PATH": "~/.rubygems"
}
"remoteEnv": {
"IMPRESS_DEPLOY_USER": "${localEnv:USER}"
},
// Use 'forwardPorts' to make a list of ports inside the container available locally.
"forwardPorts": [3000],
// Configure tool-specific properties.
// "customizations": {},
// Uncomment to connect as root instead. More info: https://aka.ms/dev-containers-non-root.
// "remoteUser": "root"
// Uncomment to connect as root instead. More info: https://containers.dev/implementors/json_reference/#remoteUser.
// "remoteUser": "root",
// Use 'postCreateCommand' to run commands after the container is created.
"postCreateCommand": "bash .devcontainer/setup-ssh-config.sh && bin/setup --skip-server"
}


@@ -1,19 +0,0 @@
#!/usr/bin/env bash
set -e # Quit if any part of this script fails.
# Mark all git repositories as safe to execute, including cached gems.
# NOTE: This would be dangerous to run on a normal multi-user machine,
# but for a dev container that only we use, it should be fine!
git config --global safe.directory '*'
# Install the app's Ruby gem dependencies.
bundle install
# Set up the databases: create the schema, and load in some default data.
bin/rails db:schema:load db:seed
# Install the app's JS dependencies.
yarn install
# Run a first-time build of the app's JS, in development mode.
yarn build:dev


@@ -0,0 +1,28 @@
#!/bin/bash
# Creates SSH config for devcontainer to use host's SSH identity
# This allows `ssh impress.openneo.net` to work without hardcoding usernames
mkdir -p ~/.ssh
chmod 700 ~/.ssh
# Only create SSH config if IMPRESS_DEPLOY_USER is explicitly set
if [ -z "$IMPRESS_DEPLOY_USER" ]; then
echo "⚠️ IMPRESS_DEPLOY_USER not set - skipping SSH config creation."
echo " This should be automatically set from your host \$USER environment variable."
echo " See docs/deployment-setup.md for details."
exit 0
fi
cat > ~/.ssh/config <<EOF
# Deployment server config
# Username: ${IMPRESS_DEPLOY_USER}
Host impress.openneo.net
User ${IMPRESS_DEPLOY_USER}
ForwardAgent yes
# Add other host configurations as needed
EOF
chmod 600 ~/.ssh/config
echo "✓ SSH config created. Deployment username: ${IMPRESS_DEPLOY_USER}"


@@ -45,7 +45,8 @@ gem 'sanitize', '~> 6.0', '>= 6.0.2'
# For working with Neopets APIs.
# unstable version of RocketAMF interprets info registry as a hash instead of an array
gem 'RocketAMF', :git => 'https://github.com/rubyamf/rocketamf.git'
# Vendored version with Ruby 3.4 ARM compatibility fixes (see vendor/gems/README-RocketAMF.md)
gem 'RocketAMF', path: 'vendor/gems/RocketAMF-1.0.0'
# For preventing too many modeling attempts.
gem 'rack-attack', '~> 6.7'


@@ -1,8 +1,7 @@
GIT
remote: https://github.com/rubyamf/rocketamf.git
revision: 796f591d002b5cf47df436dbcbd6f2ab00e869ed
PATH
remote: vendor/gems/RocketAMF-1.0.0
specs:
RocketAMF (1.0.0)
RocketAMF (1.0.0.dti1)
GEM
remote: https://rubygems.org/


@@ -1,2 +1,2 @@
web: unset PORT && env RUBY_DEBUG_OPEN=true bin/rails server
web: unset PORT && env RUBY_DEBUG_OPEN=true bin/rails server -b 0.0.0.0
js: yarn dev

README.md

@@ -2,6 +2,155 @@
# Dress to Impress
Oh! We've been revitalizing the Rails app! Fun!
Dress to Impress (DTI) is a tool for designing Neopets outfits. Load your pet, browse items, and see how they look together—all with a mobile-friendly interface!
There'll be more to say about it here soon :3
## Architecture Overview
DTI is a Rails application with a React-based outfit editor, backed by MySQL databases and a crowdsourced data collection system.
### Core Components
- **Rails backend** (Ruby 3.4, Rails 8.0): Serves web pages, API endpoints, and manages data
- **MySQL databases**: Primary database (`openneo_impress`) + legacy auth database (`openneo_id`)
- **React outfit editor**: Embedded in `app/javascript/wardrobe-2020/`, provides the main customization UI
- **Modeling system**: Crowdsources pet/item appearance data by fetching from Neopets APIs when users load their pets
### The Impress 2020 Complication
In 2020, we started a NextJS rewrite ("Impress 2020") to modernize the frontend. We've since consolidated back into Rails, but **Impress 2020 still provides essential services**:
- **GraphQL API**: Some outfit appearance data still loads via GraphQL (being migrated to Rails REST APIs)
- **Image generation**: Runs a headless browser to render outfit thumbnails and convert HTML5 assets to PNGs
See [docs/impress-2020-dependencies.md](./docs/impress-2020-dependencies.md) for migration status.
## Key Concepts
### Customization Data Model
The core data model powers outfit rendering and item compatibility. See [docs/customization-architecture.md](./docs/customization-architecture.md) for details.
**Quick summary**:
- `body_id` is the key compatibility constraint (not species or color directly)
- Items have different `swf_assets` (visual layers) for different bodies
- Restrictions are subtractive: start with all layers, hide some based on zone restrictions
- Data is crowdsourced through "modeling" (users loading pets to contribute appearance data)
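As a toy illustration of the subtractive rule (hypothetical data shapes, not the real implementation — the actual logic lives in `Outfit#visible_layers` and `useOutfitAppearance.js`):

```ruby
# Toy sketch of subtractive rendering: start with every layer, then hide
# any layer whose zone is restricted. Data shapes here are hypothetical.
def visible_layer_ids(layers, restricted_zone_ids)
  layers.reject { |l| restricted_zone_ids.include?(l[:zone_id]) }
        .map { |l| l[:id] }
end

pet_layers  = [{ id: 1, zone_id: 15 }, { id: 2, zone_id: 21 }]
item_layers = [{ id: 3, zone_id: 40 }]

# Suppose a worn item restricts zone 21 (hiding one of the pet's layers):
visible_layer_ids(pet_layers + item_layers, [21])
# => [1, 3]
```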
### Modeling (Crowdsourced Data)
DTI doesn't pre-populate item/pet data. Instead:
1. User loads a pet (via pet name lookup)
2. DTI fetches appearance data from Neopets APIs (legacy Flash/AMF protocol)
3. New `SwfAsset` records and relationships are created
4. Over time, the database learns which items fit which pet bodies
This "self-sustaining" approach means the site stays up-to-date as Neopets releases new content, without manual data entry.
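Sketched very roughly (hypothetical method and data shapes — the real processing happens in `Pet::ModelingSnapshot`):

```ruby
# Rough sketch of the modeling flow: compare the assets fetched from
# Neopets against what we already know for this body, and record the rest.
# Method name and hash shapes are hypothetical, for illustration only.
def model_pet(known_asset_ids, fetched_appearance)
  new_ids = fetched_appearance[:asset_ids] - known_asset_ids
  {
    body_id: fetched_appearance[:body_id],
    newly_learned_asset_ids: new_ids,
  }
end

known   = [101, 102]
fetched = { body_id: 93, asset_ids: [101, 102, 103] }
model_pet(known, fetched)
# => {body_id: 93, newly_learned_asset_ids: [103]}
```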
## Directory Map
### Key Application Files
```
app/
├── controllers/
│ ├── outfits_controller.rb # Outfit editor + CRUD
│ ├── items_controller.rb # Item search, pages, and JSON APIs
│ ├── pets_controller.rb # Pet loading (triggers modeling)
│ └── closet_hangers_controller.rb # User item lists ("closets")
├── models/
│ ├── item.rb # Items + compatibility prediction logic
│ ├── pet_type.rb # Species+Color combinations (has body_id)
│ ├── pet_state.rb # Visual variants (pose/gender/mood)
│ ├── swf_asset.rb # Visual layers (biology/object)
│ ├── outfit.rb # Saved outfits + rendering logic (visible_layers)
│ ├── alt_style.rb # Alternative pet appearances (Nostalgic, etc.)
│ └── pet/
│ └── modeling_snapshot.rb # Processes Neopets API data into models
├── services/
│ ├── neopets/
│ │ ├── custom_pets.rb # Neopets AMF/Flash API client (pet data)
│ │ ├── nc_mall.rb # NC Mall item scraping
│ │ └── neopass.rb # NeoPass OAuth integration
│ ├── neopets_media_archive.rb # Local mirror of images.neopets.com
│ └── lebron_nc_values.rb # NC item trading values (external API)
├── javascript/
│ ├── wardrobe-2020/ # React outfit editor (extracted from Impress 2020)
│ │ ├── loaders/ # REST API calls (migrated from GraphQL)
│ │ ├── WardrobePage/ # Main editor UI
│ │ └── components/ # Shared React components
│ └── application.js # Rails asset pipeline entrypoint
└── views/
├── outfits/
│ └── edit.html.haml # Outfit editor page (loads React app)
├── items/
│ └── show.html.haml # Item detail page
└── closet_hangers/
└── index.html.haml # User closet/item lists
```
### Configuration & Docs
```
config/
├── routes.rb # All Rails routes
├── database.yml # Multi-database setup (main + openneo_id)
└── environments/
└── *.rb # Env-specific config (incl. impress_2020_origin)
```
**Documentation:**
- [docs/customization-architecture.md](./docs/customization-architecture.md) - Deep dive into data model & rendering
- [docs/impress-2020-dependencies.md](./docs/impress-2020-dependencies.md) - What still depends on Impress 2020 service
**Tests:**
- `test/` - Test::Unit tests (privacy features)
- `spec/` - RSpec tests (models, services, integrations)
- Coverage is focused on key areas: modeling, prediction logic, external APIs
- Not comprehensive, but thorough for critical behaviors
## Tech Stack
- **Backend**: Ruby on Rails (Ruby 3.4, Rails 8.0)
- **Frontend**: Mix of Rails views (Turbo/HAML) and React (for outfit editor)
- **Database**: MySQL (two databases: `openneo_impress`, `openneo_id`)
- **Styling**: CSS, Sass (moving toward modern Rails conventions)
- **External Integrations**:
- **Neopets.com**: Legacy Flash/AMF protocol for pet appearance data (modeling)
- **Neopets NC Mall**: Web scraping for NC item availability/pricing
- **NeoPass**: OAuth integration for Neopets account linking
- **Neopets Media Archive**: Local filesystem mirror of `images.neopets.com` (never discards old files)
- **Lebron's NC Values**: Third-party API for NC item trading values ([lebron-values.netlify.app](https://lebron-values.netlify.app))
- **Impress 2020**: GraphQL for some outfit data, image generation service (being phased out)
## Development Notes
### OpenNeo ID Database
The `openneo_id` database is a legacy from when authentication was a separate service ("OpenNeo ID") meant to unify auth across multiple OpenNeo projects. DTI was the only project that succeeded, so the apps were merged—but the database split remains for now.
**Implication**: Rails is configured for multi-database mode. User auth models live in `auth_user.rb` and connect to `openneo_id`.
### Rails/React Hybrid
Most pages are traditional Rails views using Turbo for interactivity. The **outfit editor** (`/outfits/new`) is a full React app that:
- Loads into a `#wardrobe-2020-root` div
- Uses React Query for data fetching
- Calls both Rails REST APIs (in `loaders/`) and Impress 2020 GraphQL (being migrated)
The goal is to simplify this over time—either consolidate into Rails+Turbo, or commit fully to React. For now, we're in a hybrid state.
## Deployment
- **Main app**: VPS running Rails (Puma, MySQL)
- **Impress 2020**: Separate VPS in same datacenter (NextJS, GraphQL, headless browser for images)
- Both services share the same MySQL database (Impress 2020 makes SQL calls over the network)
---
**Project maintained by [@matchu](https://github.com/matchu)** • **[OpenNeo.net](https://openneo.net)**


@@ -326,6 +326,12 @@ class Item < ApplicationRecord
PetType.basic.released_before(released_at_estimate).
distinct.pluck(:body_id).sort
else
# The core challenge: distinguish "item for Maraquan pets" from "item that
# happens to fit the Maraquan Mynci" (which shares a body with basic Myncis).
# We use a general rule: a color is "modelable" only if it has at least one
# *unique* body (not shared with other colors). This filters out false
# positives while remaining self-sustaining.
# First, find our compatible pet types, then pair each body ID with its
# color. (As an optimization, we omit standard colors, other than the
# basic colors. We also flatten the basic colors into the single color
@@ -336,6 +342,7 @@
Arel.sql("IF(colors.basic, 'basic', colors.id)"), :body_id)
# Group colors by body, to help us find bodies unique to certain colors.
# Example: {93 => ["basic"], 112 => ["maraquan"], 47 => ["basic", "maraquan"]}
compatible_color_ids_by_body_id = {}.tap do |h|
compatible_pairs.each do |(color_id, body_id)|
h[body_id] ||= []
@@ -343,17 +350,19 @@
end
end
# Find non-basic colors with at least one unique compatible body. (This
# means we'll ignore e.g. the Maraquan Mynci, which has the same body as
# the Blue Mynci, as not indicating Maraquan compatibility in general.)
# Find non-basic colors with at least one unique compatible body (size == 1).
# This means we'll predict "all Maraquan pets" only if the item fits a
# Maraquan pet with a unique body (like the Maraquan Acara), not if it only
# fits the Maraquan Mynci (which shares its body with basic Myncis).
modelable_color_ids =
compatible_color_ids_by_body_id.
filter { |k, v| v.size == 1 && v.first != "basic" }.
values.map(&:first).uniq
# We can model on basic pets (perhaps in addition to the above) if we
# find at least one compatible basic body that doesn't *also* fit any of
# the modelable colors we identified above.
# We can model on basic pets if we find a basic body that doesn't also fit
# any modelable colors. This way, if an item fits both basic Mynci and
# Maraquan Acara (a modelable color), we treat it as "Maraquan item" not
# "basic item", avoiding false predictions for all basic pets.
basic_is_modelable =
compatible_color_ids_by_body_id.values.
any? { |v| v.include?("basic") && (v & modelable_color_ids).empty? }


@@ -170,6 +170,11 @@ class Outfit < ApplicationRecord
end
def visible_layers
# TODO: This method doesn't currently handle alt styles! If the outfit has
# an alt_style, we should use its layers instead of pet_state layers, and
# filter items to only those with body_id=0. This isn't needed yet because
# this method is only used on item pages, which don't support alt styles.
# See useOutfitAppearance.js for the complete logic including alt styles.
item_appearances = item_appearances(swf_asset_includes: [:zone])
pet_layers = pet_state.swf_assets.includes(:zone).to_a


@@ -52,12 +52,42 @@ class Pet::ModelingSnapshot
id = @custom_pet[:alt_style].to_i
AltStyle.find_or_initialize_by(id:).tap do |alt_style|
pet_name = @custom_pet[:name]
# Capture old asset IDs before assignment
old_asset_ids = alt_style.swf_assets.map(&:remote_id).sort
# Assign new attributes and assets
new_asset_ids = alt_style_assets.map(&:remote_id).sort
alt_style.assign_attributes(
color_id: @custom_pet[:alt_color].to_i,
species_id: @custom_pet[:species_id].to_i,
body_id: @custom_pet[:body_id].to_i,
swf_assets: alt_style_assets,
)
# Log the modeling event using Rails' change tracking
if alt_style.new_record?
Rails.logger.info "[Alt Style Modeling] Created alt style " \
"ID=#{id} for pet=#{pet_name}: " \
"species_id=#{alt_style.species_id}, " \
"color_id=#{alt_style.color_id}, " \
"body_id=#{alt_style.body_id}, " \
"asset_ids=#{new_asset_ids.inspect}"
elsif alt_style.changes.any? || old_asset_ids != new_asset_ids
changes = []
changes << "species_id: #{alt_style.species_id_was} -> #{alt_style.species_id}" if alt_style.species_id_changed?
changes << "color_id: #{alt_style.color_id_was} -> #{alt_style.color_id}" if alt_style.color_id_changed?
changes << "body_id: #{alt_style.body_id_was} -> #{alt_style.body_id}" if alt_style.body_id_changed?
changes << "asset_ids: #{old_asset_ids.inspect} -> #{new_asset_ids.inspect}" if old_asset_ids != new_asset_ids
Rails.logger.warn "[Alt Style Modeling] Updated alt style " \
"ID=#{id} for pet=#{pet_name}. " \
"CHANGED: #{changes.join(', ')}"
else
Rails.logger.info "[Alt Style Modeling] Loaded alt style " \
"ID=#{id} for pet=#{pet_name} (no changes)"
end
end
end
end


@@ -1,25 +1,74 @@
require "addressable/template"
# Neopets::NCMall integrates with the Neopets NC Mall to fetch currently
# available items and their pricing.
#
# The integration works in two steps:
#
# 1. Category Discovery: We fetch the NC Mall homepage and extract the
# browsable categories from the embedded `window.ncmall_menu` JSON data.
# We filter out special feature categories (those with external URLs) and
# structural parent nodes (those without a cat_id).
#
# 2. Item Fetching: For each category, we call the v2 category API with
# pagination support. Large categories may span multiple pages, which we
# fetch in parallel and combine. Items can appear in multiple categories,
# so the rake task de-duplicates by item ID.
#
# The parsed item data includes:
# - id: Neopets item ID
# - name: Item display name
# - description: Item description
# - price: Regular price in NC (NeoCash)
# - discount: Optional discount info (price, begins_at, ends_at)
# - is_available: Whether the item is currently purchasable
#
# This module is used by the `neopets:import:nc_mall` rake task to sync our
# NCMallRecord table with the live NC Mall.
module Neopets::NCMall
# Load the NC Mall home page content area, and return its useful data.
HOME_PAGE_URL = "https://ncmall.neopets.com/mall/ajax/home_page.phtml"
def self.load_home_page
load_page_by_url HOME_PAGE_URL
end
# Load the NC Mall page for a specific type and category ID.
# Load the NC Mall page for a specific type and category ID, with pagination.
CATEGORY_PAGE_URL_TEMPLATE = Addressable::Template.new(
"https://ncmall.neopets.com/mall/ajax/load_page.phtml?lang=en{&type,cat}"
"https://ncmall.neopets.com/mall/ajax/v2/category/index.phtml{?type,cat,page,limit}"
)
def self.load_page(type, cat)
load_page_by_url CATEGORY_PAGE_URL_TEMPLATE.expand(type:, cat:)
def self.load_page(type, cat, page: 1, limit: 24)
url = CATEGORY_PAGE_URL_TEMPLATE.expand(type:, cat:, page:, limit:)
Sync do
DTIRequests.get(url) do |response|
if response.status != 200
raise ResponseNotOK.new(response.status),
"expected status 200 but got #{response.status} (#{url})"
end
parse_nc_page response.read
end
end
end
# Load the NC Mall root document HTML, and extract the list of links to
# other pages ("New", "Popular", etc.)
# Load all pages for a specific category.
def self.load_category_all_pages(type, cat, limit: 24)
# First, load page 1 to get total page count
first_page = load_page(type, cat, page: 1, limit:)
total_pages = first_page[:total_pages]
# If there's only one page, return it
return first_page[:items] if total_pages <= 1
# Otherwise, load remaining pages in parallel
Sync do
remaining_page_tasks = (2..total_pages).map do |page_num|
Async { load_page(type, cat, page: page_num, limit:) }
end
all_pages = [first_page] + remaining_page_tasks.map(&:wait)
all_pages.flat_map { |page| page[:items] }
end
end
# Load the NC Mall root document HTML, and extract categories from the
# embedded menu JSON.
ROOT_DOCUMENT_URL = "https://ncmall.neopets.com/mall/shop.phtml"
PAGE_LINK_PATTERN = /load_items_pane\(['"](.+?)['"], ([0-9]+)\).+?>(.+?)</
def self.load_page_links
MENU_JSON_PATTERN = /window\.ncmall_menu = (\[.*?\]);/m
def self.load_categories
html = Sync do
DTIRequests.get(ROOT_DOCUMENT_URL) do |response|
if response.status != 200
@@ -31,11 +80,34 @@ module Neopets::NCMall
end
end
# Extract `load_items_pane` calls from the root document's HTML. (We use
# a very simplified regex, rather than actually parsing the full HTML!)
html.scan(PAGE_LINK_PATTERN).
map { |type, cat, label| {type:, cat:, label:} }.
uniq
# Extract the ncmall_menu JSON from the script tag
match = html.match(MENU_JSON_PATTERN)
unless match
raise UnexpectedResponseFormat,
"could not find window.ncmall_menu in homepage HTML"
end
begin
menu = JSON.parse(match[1])
rescue JSON::ParserError => e
Rails.logger.debug "Failed to parse ncmall_menu JSON: #{e.message}"
raise UnexpectedResponseFormat,
"failed to parse ncmall_menu as JSON"
end
# Flatten the menu structure, and filter to browsable categories
browsable_categories = flatten_categories(menu).
# Skip categories without a cat_id (structural parent nodes)
reject { |cat| cat['cat_id'].blank? }.
# Skip categories with external URLs (special features)
reject { |cat| cat['url'].present? }
# Map each category to include the API type (and remove load_type)
browsable_categories.map do |cat|
cat.except("load_type").merge(
"type" => map_load_type_to_api_type(cat["load_type"])
)
end
end
def self.load_styles(species_id:, neologin:)
@@ -50,6 +122,26 @@ module Neopets::NCMall
private
# Map load_type from menu JSON to the v2 API type parameter.
def self.map_load_type_to_api_type(load_type)
case load_type
when "new"
"new_items"
when "popular"
"popular_items"
else
"browse"
end
end
# Flatten nested category structure (handles children arrays)
def self.flatten_categories(menu)
menu.flat_map do |cat|
children = cat["children"] || []
[cat] + flatten_categories(children)
end
end
STYLING_STUDIO_URL = "https://www.neopets.com/np-templates/ajax/stylingstudio/studio.php"
def self.load_styles_tab(species_id:, neologin:, tab:)
Sync do
@@ -81,20 +173,7 @@
end
end
def self.load_page_by_url(url)
Sync do
DTIRequests.get(url) do |response|
if response.status != 200
raise ResponseNotOK.new(response.status),
"expected status 200 but got #{response.status} (#{url})"
end
parse_nc_page response.read
end
end
end
# Given a string of NC page data, parse the useful data out of it!
# Given a string of v2 NC page data, parse the useful data out of it!
def self.parse_nc_page(nc_page_str)
begin
nc_page = JSON.parse(nc_page_str)
@@ -104,24 +183,14 @@ module Neopets::NCMall
"failed to parse NC page response as JSON"
end
unless nc_page.has_key? "object_data"
raise UnexpectedResponseFormat, "missing field object_data in NC page"
# v2 API returns items in a "data" array
unless nc_page.has_key? "data"
raise UnexpectedResponseFormat, "missing field data in v2 NC page"
end
object_data = nc_page["object_data"]
item_data = nc_page["data"] || []
# NOTE: When there's no object data, it will be an empty array instead of
# an empty hash. Weird API thing to work around!
object_data = {} if object_data == []
# Only the items in the `render` list are actually listed as directly for
# sale in the shop. `object_data` might contain other items that provide
# supporting information about them, but aren't actually for sale.
visible_object_data = (nc_page["render"] || []).
map { |id| object_data[id.to_s] }.
filter(&:present?)
items = visible_object_data.map do |item_info|
items = item_data.map do |item_info|
{
id: item_info["id"],
name: item_info["name"],
@@ -132,7 +201,12 @@
}
end
{items:}
{
items:,
total_pages: nc_page["totalPages"].to_i,
page: nc_page["page"].to_i,
limit: nc_page["limit"].to_i,
}
end
# Given item info, return a hash of discount-specific info, if any.


@@ -12,7 +12,7 @@ FileUtils.chdir APP_ROOT do
# This script is idempotent, so that you can run it at any time and get an expectable outcome.
# Add necessary setup steps to this file.
puts "== Installing dependencies =="
puts "== Installing Ruby dependencies =="
system("bundle check") || system!("bundle install")
# puts "\n== Copying sample files =="
@@ -23,6 +23,17 @@ FileUtils.chdir APP_ROOT do
puts "\n== Preparing database =="
system! "bin/rails db:prepare"
puts "\n== Importing public modeling data =="
system! "bin/rails public_data:pull"
puts "\n== Installing Yarn dependencies =="
system! "corepack enable"
system! "corepack install"
system! "yarn install"
puts "\n== Building development JS files =="
system! "yarn build:dev"
puts "\n== Removing old logs and tempfiles =="
system! "bin/rails log:clear tmp:clear"


@@ -1,12 +1,9 @@
development:
primary:
# You can override these default settings with this environment variable,
# fully or partially. We do this in the .devcontainer setup!
url: <%= ENV['DATABASE_URL_PRIMARY_DEV'] %>
adapter: mysql2
host: <%= ENV.fetch("DB_HOST", "localhost") %>
database: openneo_impress
username: impress_dev
password: impress_dev
username: root
pool: 5
encoding: utf8mb4
collation: utf8mb4_unicode_520_ci
@@ -14,13 +11,10 @@ development:
sql_mode: TRADITIONAL
openneo_id:
# You can override these default settings with this environment variable,
# fully or partially. We do this in the .devcontainer setup!
url: <%= ENV['DATABASE_URL_OPENNEO_ID_DEV'] %>
adapter: mysql2
host: <%= ENV.fetch("DB_HOST", "localhost") %>
database: openneo_id
username: impress_dev
password: impress_dev
username: root
pool: 2
variables:
sql_mode: TRADITIONAL
@@ -28,13 +22,10 @@ development:
test:
primary:
# You can override these default settings with this environment variable,
# fully or partially. We do this in the .devcontainer setup!
url: <%= ENV['DATABASE_URL_PRIMARY_TEST'] %>
adapter: mysql2
host: <%= ENV.fetch("DB_HOST", "localhost") %>
database: openneo_impress_test
username: impress_dev
password: impress_dev
username: root
pool: 5
encoding: utf8mb4
collation: utf8mb4_unicode_520_ci
@@ -42,13 +33,10 @@ test:
sql_mode: TRADITIONAL
openneo_id:
# You can override these default settings with this environment variable,
# fully or partially. We do this in the .devcontainer setup!
url: <%= ENV['DATABASE_URL_OPENNEO_ID_TEST'] %>
adapter: mysql2
host: <%= ENV.fetch("DB_HOST", "localhost") %>
database: openneo_id_test
username: impress_dev
password: impress_dev
username: root
pool: 2
variables:
sql_mode: TRADITIONAL


@@ -23,7 +23,7 @@
# 6. Update the service file manually to reference the newly-uploaded version by path.
# 7. Link the new version as `current` manually.
# 8. Reset the service file to use the new Ruby to run `current`.
skip_set_as_current: yes
skip_set_as_current: no
tasks:
- name: Generate a version name from the current timestamp
command: date '+%Y-%m-%d-%s'


@@ -15,3 +15,4 @@
ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIu5a+mp2KKSGkOGWQPrARCrsqJS4g2vK7TmRIbj/YBh Matchu's Desktop (Leviathan 2023)
ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKFwWryq6slOQqkrJ7HIig7BvEQVQeH19hFwb+9VpXgz Matchu's Laptop (Ebon Hawk)
ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAINq0HDYIUwRnrlKBWyGWJbJsx3M8nLg4nRxaA+9lJp+o Matchu's Laptop (Death Star)
ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEFCEvr4e0uPpDtySU5qb3NkmSXNaRDSRg4WN2o8QJmF matchu@openneo.net


@@ -449,12 +449,12 @@
minute: "*/10"
job: "bash -c 'source /etc/profile && source ~/.bash_profile && cd /srv/impress/current && bin/rails nc_mall:sync'"
- name: Create 10min cron job to run `rails neopets:import:nc_mall`
- name: Create 10min cron job to run `rails neopets:import`
become_user: impress
cron:
name: "Impress: import NC Mall data"
name: "Impress: import Neopets data"
minute: "*/10"
job: "bash -c 'source /etc/profile && source ~/.bash_profile && cd /srv/impress/current && bin/rails neopets:import:nc_mall'"
job: "bash -c 'source /etc/profile && source ~/.bash_profile && cd /srv/impress/current && bin/rails neopets:import'"
- name: Create weekly cron job to run `rails public_data:commit`
become_user: impress


@@ -0,0 +1,358 @@
# Dress to Impress: Customization System Architecture
Dress to Impress (DTI) models Neopets's pet customization system: layered 2D images of pets wearing clothing items. This guide explains how DTI's data models represent the customization system and how they combine to render outfit images.
## Core Models
The customization system is built on these key models:
### Species and Color
- **Species**: The type of Neopet (Acara, Zafara, etc.)
- **Color**: The paint color applied to a pet (Blue, Maraquan, Halloween, etc.)
- Colors have flags: `basic` (starter colors like Blue/Red), `standard` (follows typical body shape), `nonstandard` (unusual shapes)
### PetType
**PetType = Species + Color**
- Represents the combination of a species and color (e.g., "Blue Acara", "Maraquan Acara")
- **Contains the critical `body_id` field**: the physical shape/compatibility ID
- Body ID determines what clothing items are compatible with this pet
- Example: "Blue Acara" and "Red Acara" share the same body_id (they have the same shape), but "Maraquan Acara" has a different body_id (different shape)
### PetState
**PetState = A visual variant of a PetType**
- Represents different presentations: gender (feminine/masculine) and mood (happy/sad/sick)
- Standard poses: `HAPPY_FEM`, `HAPPY_MASC`, `SAD_FEM`, `SAD_MASC`, `SICK_FEM`, `SICK_MASC`
- Special case: `UNCONVERTED` - legacy pets from before the customization system (now mostly replaced by Alt Styles)
- Has many `swf_assets` (the actual visual layers for the pet's appearance)
### Item
- Represents a wearable clothing item
- Can have different appearances for different body IDs
- Tracks `cached_compatible_body_ids`: which bodies this item has been seen on
- Tracks `cached_occupied_zone_ids`: which zones this item's layers occupy
### Item.Appearance
**Not a database model** - a `Struct` that represents "this item on this body"
- Created on-the-fly when rendering
- Contains: the item, the body (id + species), and the relevant `swf_assets`
- Why it exists: items can look completely different on different bodies
### SwfAsset
**The actual visual layer** - a single image in the final composite
- Two types (via `type` field):
- `'biology'`: Pet appearance layers (tied to PetStates)
- `'object'`: Item appearance layers (tied to Items)
- Has a `body_id`: either a specific body, or `0` meaning "fits all bodies"
- Belongs to a `Zone` (which determines rendering depth/order)
- Contains URLs for the image assets (Flash SWF legacy, HTML5 canvas/SVG modern)
- Has `zones_restrict`: a bitfield indicating which zones this asset restricts
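To make the `zones_restrict` bitfield concrete, here's one plausible decoding, assuming it's stored as a string of `"0"`/`"1"` flags where character `i` corresponds to zone ID `i + 1`. That indexing is an assumption for illustration only; the real `restricted_zone_ids` implementation is the source of truth.

```ruby
# Hypothetical decoding of `zones_restrict`: assume a string of "0"/"1" flags
# where character i corresponds to zone ID i + 1. (The exact encoding is an
# assumption here; see the real `restricted_zone_ids` implementation.)
def restricted_zone_ids(zones_restrict)
  zones_restrict.each_char.with_index.filter_map do |flag, i|
    i + 1 if flag == "1"
  end
end

restricted_zone_ids("01001") # => [2, 5]
```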
### Zone
- Defines a layer position in the rendering stack
- Has a `depth`: lower depths render behind, higher depths render in front
- Has a `label`: human-readable name (e.g., "Hat", "Background")
- Multiple zones can share the same label but have different depths (for items that "wrap around" pets)
### AltStyle
**Alternative pet appearance** - a newer system for non-standard pet looks
- Used for "Nostalgic" (pre-customization) appearances and other special styles
- Replaces the pet's normal layers entirely (not additive)
- Has its own `body_id` (distinct from regular pet body IDs)
- Most items are incompatible - only `body_id=0` items work with alt styles
- Example: "Nostalgic Grey Zafara" is an AltStyle, not a PetState
---
## The Rendering Pipeline
This is the core of DTI's customization system: how we turn database records into layered outfit images.
### Overview
**Input**: An `Outfit` (a pet appearance + a list of worn items)
**Output**: A sorted list of `SwfAsset` layers to render, bottom-to-top
### Step 1: Choose the Biology Layers
```ruby
# If alt_style is present, use its layers; otherwise use pet_state's layers
biology_layers = outfit.alt_style ? outfit.alt_style.swf_assets : outfit.pet_state.swf_assets
```
- Alt styles completely replace pet layers (they don't layer on top)
- Regular pets use their pet_state's layers
- All biology layers have `type = 'biology'`
### Step 2: Load Item Appearances
```ruby
# Get how each worn item looks on this body
body_id = outfit.alt_style ? outfit.alt_style.body_id : outfit.pet_type.body_id
item_appearances = Item.appearances_for(outfit.worn_items, body_id)
item_layers = item_appearances.flat_map(&:swf_assets)
```
- For each worn item, find the `swf_assets` that match this body
- Matching logic: asset's `body_id` must equal the pet's `body_id`, OR asset's `body_id = 0` (fits all)
- For alt styles: only `body_id=0` items will match (body-specific items are incompatible)
- All item layers have `type = 'object'`
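The matching logic above can be sketched in plain Ruby, using a `Struct` stand-in rather than the real `SwfAsset` model:

```ruby
# Sketch of the body-matching rule: an asset fits if its body_id equals the
# pet's body_id, or is 0 ("fits all bodies").
Asset = Struct.new(:body_id)

def fits?(asset, pet_body_id)
  asset.body_id == pet_body_id || asset.body_id.zero?
end

assets = [Asset.new(93), Asset.new(112), Asset.new(0)]
assets.select { |a| fits?(a, 93) }.map(&:body_id) # => [93, 0]
```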
### Step 3: Apply Restriction Rules
This is where it gets complex. We need to hide certain layers based on zone restrictions.
#### Rule 3a: Items Restrict Pet Layers
```ruby
# Collect all zones that items restrict
item_restricted_zone_ids = item_appearances.flat_map(&:restricted_zone_ids)
# Hide pet layers in those zones
biology_layers.reject! { |layer| item_restricted_zone_ids.include?(layer.zone_id) }
```
**Example**: The "Zafara Agent Hood" restricts the "Hair Front" and "Head Transient Biology" zones.
**How it works**:
- Items have a `zones_restrict` bitfield indicating which zones they restrict
- When an item restricts zone 5, all pet layers in zone 5 are hidden
- This allows items to "replace" parts of the pet's appearance
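With plain data, Rule 3a reduces to a simple filter (again a sketch with `Struct` stand-ins, not the real models):

```ruby
# Hide pet ("biology") layers in any zone that a worn item restricts.
Layer = Struct.new(:zone_id)

item_restricted_zone_ids = [5, 12] # e.g. collected from the items' zones_restrict data
biology_layers = [Layer.new(5), Layer.new(8), Layer.new(12)]

biology_layers.reject! { |l| item_restricted_zone_ids.include?(l.zone_id) }
biology_layers.map(&:zone_id) # => [8]
```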
#### Rule 3b: Pets Restrict Body-Specific Item Layers
This rule is asymmetric and more complex!
> Note: This is a legacy rule, originally built for Unconverted pets.
> Now, Unconverted pets don't exist… but pet states *do* still technically
> support zone restrictions. We should examine whether any existing pet
> states still use this feature, and consider simplifying it out of the
> system if not.
```ruby
# Collect all zones that the pet restricts
pet_restricted_zone_ids = biology_layers.flat_map(&:restricted_zone_ids)
# Special case: Unconverted pets can't wear ANY body-specific items
if pet_state.pose == "UNCONVERTED"
item_layers.reject! { |layer| layer.body_specific? }
else
# Other pets: hide body-specific items only in restricted zones
item_layers.reject! do |layer|
layer.body_specific? && pet_restricted_zone_ids.include?(layer.zone_id)
end
end
```
**Example**: Unconverted pets can wear backgrounds (body_id=0) but not species-specific clothing.
**Key distinction**:
- Items restricting zones → hide **pet** layers
- Pets restricting zones → hide **body-specific item** layers
- Unconverted pets are special: they reject ALL body-specific items, regardless of zone
**Why `body_specific?` matters**:
- Items with `body_id=0` fit everyone and are never hidden by pet restrictions
- Items with specific body IDs are "body-specific" and can be hidden
#### Rule 3c: Pets Restrict Their Own Layers
```ruby
# Pets can hide parts of themselves too
biology_layers.reject! { |layer| pet_restricted_zone_ids.include?(layer.zone_id) }
```
**Example**: The Wraith Uni has a horn asset, but its zone restrictions hide it.
**Why this exists**: Sometimes the pet data includes layers that shouldn't be visible, probably to enable the Neopets team to more easily manage certain kinds of appearance assets. This allows the pet's metadata to control which of its own layers render.
### Step 4: Sort by Depth and Render
```ruby
all_layers = biology_layers + item_layers
# Ruby's sort isn't guaranteed stable, so tie-break by original index to keep
# item layers on top of pet layers that share a zone depth.
all_layers = all_layers.sort_by.with_index { |layer, i| [layer.depth, i] }
```
- Pet layers first, then item layers (maintains order for same-depth assets)
- Sort by zone depth: lower depths behind, higher depths in front
- The sorted list is the final render order: first layer on bottom, last layer on top
**Important**: When a pet layer and an item layer share the same zone (and thus the same depth), the item appears on top. This works because item layers come second in the concatenation, and the index tie-break preserves that order through the sort.
---
## Understanding Compatibility
### Why body_id Is the Key
You might expect items to be compatible with a *species* (all Acaras) or a *color* (all Maraquan pets). But the system uses **body_id** instead.
**Why?**
- Some colors share the same body shape across species (e.g., most "Blue" pets follow a standard shape)
- Some colors have unique body shapes (e.g., Maraquan pets are aquatic and need different clothing)
- Some species have unique shapes even in "standard" colors
- Body ID captures the actual physical compatibility, regardless of color or species
**Example**:
- Blue Acara (body_id: 93) and Red Acara (body_id: 93) → same body, share items
- Blue Acara (body_id: 93) and Maraquan Acara (body_id: 112) → different bodies, different items
- An item compatible with body 93 fits both Blue and Red Acaras
### Body ID 0: Fits All
- Items with `body_id=0` assets fit every pet
- Examples: backgrounds, foregrounds, trinkets
- These are the only items compatible with alt styles
### Standard vs. Nonstandard Bodies
- **Standard bodies**: Follow typical species shapes (usually "Blue" or other basic colors)
- **Nonstandard bodies**: Unusual shapes (Maraquan, Baby, Mutant, etc.)
- This distinction helps with modeling predictions (more on that below)
### How Compatibility Is Discovered
DTI doesn't know in advance which items fit which bodies. Instead, **users contribute compatibility data through "modeling"**:
1. A user loads their pet wearing an item
2. DTI's backend calls Neopets APIs to fetch the pet's appearance data
3. The response includes which asset IDs are present
4. DTI records: "Item X has asset Y for body_id Z"
5. Over time, the database learns which items fit which bodies
This crowdsourced approach is why DTI is "self-sustaining" - users passively contribute data just by using the site.
---
## Modeling and Data Sources
### The Modeling Process
**"Modeling"** is DTI's term for crowdsourcing appearance data from users:
1. User submits a pet name, often to use as the base of a new outfit
2. DTI makes an API request to Neopets using the legacy Flash/AMF protocol
3. Response contains:
- Pet's species, color, mood, gender
- List of biology asset IDs (the pet's appearance)
- List of object asset IDs (items the pet is wearing)
- Metadata for each asset (zone, manifest URLs, etc.)
4. DTI creates/updates records:
- `PetType` (if new species+color combo)
- `PetState` (if new pose/mood/gender combo)
- `SwfAsset` records for each asset
- `ParentSwfAssetRelationship` linking assets to pets/items
See `app/models/pet/modeling_snapshot.rb` for the full implementation.
### Cached Fields
To avoid expensive queries, several models cache computed data:
- **Item**:
- `cached_compatible_body_ids`: Which bodies we've seen this item on
- `cached_occupied_zone_ids`: Which zones this item's assets occupy
- `cached_predicted_fully_modeled`: Whether we think we've seen all compatible bodies
- **PetState**:
- `swf_asset_ids`: Direct list of asset IDs (for avoiding duplicate states)
These fields are updated automatically when new modeling data arrives.
### Prediction Logic
DTI tries to predict which bodies an item *should* work on, based on which bodies we've already seen it modeled on. This helps prioritize modeling work and estimate how "complete" an item's data is.
#### The Maraquan Mynci Problem
The core challenge is avoiding **false positives**. Consider: most Maraquan pets have unique, aquatic body shapes and wear special Maraquan-themed items. But the Maraquan Mynci shares the same body_id as basic Myncis - it can wear any standard Mynci item.
**Naive approach**: "We saw this item on a Maraquan pet, so predict it fits all Maraquan pets"
**Problem**: Most items fit the Maraquan Mynci but NOT other Maraquan pets!
#### The Solution: Unique Body Detection
For each item, we check whether **this item's modeling data** shows a color in a "modelable" way: the item must fit at least one body_id that ONLY that color uses (not shared with any other color).
**Example 1 - Real Maraquan item**:
- Modeled on: Maraquan Acara (body 112), Maraquan Zafara (body 98)
- Both bodies are unique to Maraquan (no basic pets share them)
- **Prediction**: This fits all Maraquan pets ✓
**Example 2 - Standard Mynci item**:
- Modeled on: Blue Mynci (body 47), Maraquan Mynci (body 47)
- Body 47 is shared by basic and Maraquan
- Maraquan has NO unique body for this item
- **Prediction**: This is a standard item for Myncis, not a Maraquan item ✓
**Example 3 - Maraquan item on Mynci**:
- Modeled on: Maraquan Acara (body 112), basic Mynci (body 47)
- Body 112 is unique to Maraquan
- Maraquan HAS a unique body (the Acara)
- **Prediction**: This fits all Maraquan pets (including Mynci) ✓
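The unique-body check in the three examples above can be sketched with toy data. All body IDs and the helper name here are invented for illustration; the real logic lives in `Item#predicted_body_ids`.

```ruby
# For each color, find the body IDs only that color uses; a color is
# "modelable" for this item if we've seen the item on one of those bodies.
def modelable_colors(modeled_body_ids, bodies_by_color)
  bodies_by_color.filter_map do |color, bodies|
    other_bodies = bodies_by_color.reject { |c, _| c == color }.values.flatten
    unique_bodies = bodies - other_bodies
    color if (modeled_body_ids & unique_bodies).any?
  end
end

bodies_by_color = {
  "Basic"    => [47, 93],      # Mynci body 47 is shared with Maraquan!
  "Maraquan" => [47, 98, 112],
}

modelable_colors([47], bodies_by_color)      # => [] (Example 2: shared body only)
modelable_colors([112, 47], bodies_by_color) # => ["Maraquan"] (Examples 1 and 3)
```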
#### Basic Color Handling
Basic colors (Blue, Red, Green, etc.) always share the same body IDs in practice, so they're treated as a group. An item is predicted to fit all basic pets if we find at least one basic body that doesn't also fit any of the modelable colors identified above.
This prevents false positives: if an item fits both a unique Maraquan body AND a basic/Maraquan shared body (like the Mynci), we treat it as a Maraquan item, not a universal basic item.
#### Edge Cases
The algorithm has several early exits:
1. **Manual override**: If `modeling_status_hint` is set to "done", trust current data is complete
2. **Fits all** (`body_id=0`): Item fits everyone, prediction complete
3. **Single body**: Only one body seen - could be species-specific, stay conservative
4. **No data**: No bodies yet - optimistically predict all basic bodies for recently-released items
#### Why This Works
This approach is **self-sustaining**: no manual color flagging needed. As users model items, the unique body pattern emerges naturally, and predictions improve automatically.
See `Item#predicted_body_ids` in `app/models/item.rb:306-373` for the full implementation.
### Flash → HTML5 Transition
Neopets originally used Flash (SWF files) for customization. Over time, they've migrated to HTML5 (Canvas/SVG).
**SwfAsset fields**:
- `url`: Legacy SWF file URL (mostly unused now)
- `manifest_url`: JSON manifest for HTML5 assets
- `has_image`: Whether we generated a PNG from the SWF (legacy)
**Modern rendering**:
1. Load the manifest JSON from `manifest_url`
2. Parse it to find canvas library JS, SVG files, or PNG images
3. Use Impress 2020 (the frontend) to render them
**Fallbacks**:
- If `manifest_url` is missing or 404s, fall back to legacy PNG images
- If those are missing too, the asset can't be rendered
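The fallback chain might look like this sketch (plain `Struct` stand-in with the field names from the SwfAsset section above; no network checks, so the 404 case isn't modeled):

```ruby
SwfAsset = Struct.new(:manifest_url, :has_image, keyword_init: true)

# Prefer the HTML5 manifest; fall back to the legacy PNG; else unrenderable.
def render_source(asset)
  return :html5_manifest if asset.manifest_url
  return :legacy_png if asset.has_image
  :unrenderable
end

render_source(SwfAsset.new(manifest_url: "https://example.com/manifest.json"))
# => :html5_manifest
render_source(SwfAsset.new(has_image: true)) # => :legacy_png
render_source(SwfAsset.new)                  # => :unrenderable
```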
---
## Summary: Key Takeaways
1. **body_id is the compatibility key**, not species or color
2. **Restrictions are subtractive**: start with all layers, then hide some via restriction rules
3. **Restrictions are asymmetric**: items hide pet layers, pets hide body-specific items
4. **Unconverted pets are special**: they reject all body-specific items (and are no longer available on Neopets)
5. **Alt styles replace pet layers** and only work with body_id=0 items
6. **Data is crowdsourced** through user modeling, not pre-populated
7. **The system evolved over time**: Flash→HTML5, UC→Alt Styles, etc.
---
## Code References
- **Rendering logic**: `Outfit#visible_layers` in `app/models/outfit.rb`
- **Item appearances**: `Item.appearances_for` in `app/models/item.rb`
- **Modeling**: `Pet::ModelingSnapshot` in `app/models/pet/modeling_snapshot.rb`
- **Body compatibility**: `Item#compatible_body_ids`, `Item#predicted_body_ids` in `app/models/item.rb`
- **Pet state poses**: `PetState#pose`, `PetState.with_pose` in `app/models/pet_state.rb`

docs/deployment-setup.md Normal file

@@ -0,0 +1,151 @@
# Deployment Setup Guide
This guide covers how to set up deployment access to Dress to Impress from a new machine.
## Overview
The deployment system uses Ansible playbooks to automate deployment to `impress.openneo.net`. You'll need:
- SSH key authorized on the production server
- Secret files for configuration and credentials
- Ansible and build dependencies installed locally
## Deploying from Devcontainer
The devcontainer is configured to forward your host's SSH agent, allowing you to deploy without copying keys into the container.
**Setup for Deployers:**
If you have deploy access to the production server, the devcontainer is pre-configured to work automatically:
1. The `IMPRESS_DEPLOY_USER` environment variable is automatically set from your host's `$USER` environment variable
2. On container creation, this creates `~/.ssh/config` inside the container with your username
3. Rebuild the devcontainer so the SSH config is (re)created
4. Verify SSH access: `ssh impress.openneo.net whoami` should return your username
**For Contributors (Non-Deployers):**
If you don't have deploy access, you don't need to do anything. The devcontainer will skip SSH config creation and show a warning, but this won't affect your ability to develop.
## Setup Checklist for New Machine
If you have an existing machine with deploy access, this workflow lets you bootstrap a new machine by authorizing its SSH key and copying secret files.
### Phase 1: On New Machine
**Prerequisites to Install:** (or use devcontainer)
- [ ] Install Ansible
- [ ] Install Ruby 3.4+
- [ ] Install Node.js/Yarn
**Generate SSH Key:**
- [ ] Generate new SSH key pair: `ssh-keygen -t ed25519 -f ~/.ssh/id_impress_deploy`
- [ ] Copy the **public key** (`~/.ssh/id_impress_deploy.pub`) to a location accessible from your existing machine
- Example: USB drive, shared network location, or transfer via secure method
### Phase 2: On Existing Machine (Has Deploy Access)
**Add New SSH Key:**
- [ ] Copy the public key from the transfer location to the repository
- [ ] Append public key to `deploy/files/authorized-ssh-keys.txt`
- [ ] Run `bin/deploy:setup` to sync the updated authorized keys to the server
- [ ] Test that the new key works: `ssh -i /path/to/new/private/key impress.openneo.net`
**Copy Secret Files to Transfer Location:**
- [ ] Copy `deploy/files/production.env` to transfer location
- [ ] Copy `deploy/files/setup_secrets.yml` to transfer location (optional, only needed for full setup)
### Phase 3: Return to New Machine
**Copy Secret Files:**
- [ ] Copy `production.env` from transfer location to `deploy/files/production.env`
- [ ] Copy `setup_secrets.yml` from transfer location to `deploy/files/setup_secrets.yml` (optional)
- [ ] `ln -s deploy/files/production.env .env.production`
- [ ] Set proper permissions: `chmod 600` on all secret files
**Configure SSH:**
- [ ] Ensure SSH key has proper permissions: `chmod 600 ~/.ssh/id_impress_deploy`
- [ ] (Optional) Add to `~/.ssh/config`:
```
Host impress.openneo.net
IdentityFile ~/.ssh/id_impress_deploy
```
**Verify Access:**
- [ ] Test SSH connection: `ssh impress.openneo.net`
- [ ] Verify deployment works: `bin/deploy`
### Security Reminder
- [ ] Delete secret files from transfer location after copying
- [ ] Optionally remove the temporary public key file from the existing machine
---
## Secret Files Reference
These files are gitignored and must be obtained from an existing deployment machine or production server:
### 1. `deploy/files/production.env`
Production environment variables deployed to `/srv/impress/shared/production.env` on the server.
Required variables:
- `DATABASE_URL_PRIMARY` - MySQL connection URL for primary database
- `DATABASE_URL_OPENNEO_ID` - MySQL connection URL for auth database
- `IMPRESS_2020_ORIGIN` - URL of Impress 2020 service (optional, defaults to `https://impress-2020.openneo.net`)
- `RAILS_LOG_LEVEL` - Log level (optional, defaults to "info")
### 2. `deploy/files/setup_secrets.yml`
Ansible variables for the setup playbook. Only needed if running initial server setup (`bin/deploy:setup`).
Required variables:
- `mysql_root_password` - Root password for MySQL/MariaDB
- `mysql_user_password` - Password for `openneo_impress` MySQL user
- `mysql_user_password_2020` - Password for `impress2020` MySQL user
- `dev_ips` - List of IP addresses allowed to connect to MySQL
Also contains credentials, including:
- `matchu_email_password` - SMTP password for `matchu@openneo.net` (Fastmail)
- `neopass.client_secret` - OAuth client secret for NeoPass integration
## Deployment Commands
### Initial Server Setup (one-time)
```bash
bin/deploy:setup
```
Runs the Ansible setup playbook to configure the server (Ruby, MySQL, nginx, SSL, users, firewall, etc.). Prompts for sudo password.
### Deploy New Version
```bash
bin/deploy
```
Compiles assets locally, then deploys to production. No sudo required.
### Rollback
```bash
bin/deploy:rollback [version-name]
```
Rolls back to a previous deployment.
## Server Details
- **Host:** `impress.openneo.net`
- **OS:** Ubuntu 20.04 LTS
- **Ruby:** 3.4.5
- **App location:** `/srv/impress/`
- **Deploy user:** `impress`
- **Service:** systemd service named `impress`
## Troubleshooting
**SSH Permission Denied:**
- Verify your SSH key is in `authorized-ssh-keys.txt` and `bin/deploy:setup` was run
- Check key permissions: `chmod 600 ~/.ssh/id_impress_deploy`
- Test with verbose output: `ssh -v impress.openneo.net`
**Asset Compilation Fails:**
- Ensure Ruby and Node.js/Yarn are installed
**Deployment Permission Errors:**
- Verify your user is in the `impress-deployers` group on the server
- This is set up automatically by `bin/deploy:setup`


@@ -0,0 +1,158 @@
# Impress 2020 Dependencies
This document tracks how the main DTI Rails app still depends on the separate Impress 2020 service, and what would need to be migrated to fully consolidate into a single Rails app.
## Background
In 2020, we started a NextJS rewrite called "Impress 2020" to modernize the frontend. We've since decided to consolidate back into the Rails app, but migration is ongoing. The Rails app now embeds the Impress 2020 React frontend (in `app/javascript/wardrobe-2020/`) for the outfit editor, but some functionality still calls back to the Impress 2020 GraphQL API.
## Current State
### What's Migrated to Rails
The following have been migrated to Rails REST API endpoints (in `app/javascript/wardrobe-2020/loaders/`):
- **Item search** (`/items.json`) - Searching for items with filters
- **Item appearances** (`/items/:id/appearances.json`) - Getting item layers for different bodies
- **Outfit save/load** (`/outfits.json`, `/outfits/:id.json`) - CRUD operations on outfits
- **Alt styles** (`/species/:id/alt-styles.json`) - Loading alternative pet appearances
### What Still Uses Impress 2020 GraphQL
The following GraphQL queries and mutations are still hitting the Impress 2020 service:
#### Core Wardrobe Functionality
- **`OutfitPetAppearance`** - Load pet appearance (biology layers) by species/color/pose
- **`OutfitPetAppearanceById`** - Load pet appearance by ID
- **`OutfitItemsAppearance`** - Load item appearances (object layers) for worn items
- **`OutfitStateItems`** - Load item metadata for the outfit state
- **`OutfitStateItemConflicts`** - Check for conflicting items in zones
- **`PosePicker`** - Load available poses for a species/color combination
- **`SpeciesColorPicker`** - Load all species and colors for the picker UI
- **`SearchToolbarZones`** - Load all zones for search filtering
- **`OutfitThumbnailIfCached`** - Check if an outfit thumbnail exists in the cache
#### Support/Admin Tools
These are staff-only features for managing modeling data:
- **`ItemSupportFields`** - Load item data for support drawer
- **`ItemSupportRestrictedZones`** - Load restricted zones for an item
- **`ItemSupportDrawerAllColors`** - Load all colors for support tools
- **`PosePickerSupport`** - Load pet appearance data for labeling
- **`PosePickerSupportRefetchCanonicalAppearances`** - Refresh canonical appearance data
- **`AllItemLayersSupportModal`** - Load all layers for an item (support view)
- **`AllItemLayersSupportModal_BulkAddProposal`** - Preview bulk layer additions
- **`WriteItemFromLoader`** - Write item data to Apollo cache (used after REST calls)
#### Support/Admin Mutations
- **`ItemSupportDrawerSetItemExplicitlyBodySpecific`** - Mark item as body-specific
- **`ItemSupportDrawerSetManualSpecialColor`** - Set manual special color for item
- **`PosePickerSupportSetPetAppearancePose`** - Update pet appearance pose label
- **`PosePickerSupportSetPetAppearanceIsGlitched`** - Mark pet appearance as glitched
- **`ApperanceLayerSupportSetLayerBodyId`** - Update layer's body ID
- **`AppearanceLayerSupportRemoveButton`** - Remove a layer
- **`AllItemLayersSupportModal_BulkAddMutation`** - Bulk add layers to bodies
### Image Services
Impress 2020 also provides image generation services:
- **Outfit thumbnails** - Generates PNG images of outfits at various sizes (150px, 300px, 600px)
- **Asset image rendering** - Runs a headless browser to convert HTML5 canvas movies to static images
These are served via:
- Production: `https://outfits.openneo-assets.net/outfits/:id/v/:timestamp/:size.png`
- Development: `Rails.configuration.impress_2020_origin + "/api/outfitImage?id=..."`
## Migration Strategy
There are two potential paths forward:
### Option A: Incremental GraphQL → REST Migration
Continue migrating GraphQL queries to Rails REST endpoints, keeping the React wardrobe.
**Pros**: Lower risk, incremental progress, preserves mobile-friendly UI
**Cons**: Maintains complexity of React app + dual API surface
### Option B: Wardrobe Rewrite (Rails + Turbo)
Rewrite the outfit editor as a Rails view with Turbo/Stimulus, similar to the item show page.
**Pros**: Massive simplification—remove React, GraphQL, and complex data fetching entirely
**Cons**: High risk (rewrites are dangerous), significant effort, potential UI regressions
**Note**: Simplicity has been DTI's most valuable architectural principle long-term. The complexity of maintaining the React wardrobe + its APIs is significant. But rewrites carry inherent risk.
---
### Option A: Priority 1 - Core Data Loading
The most important migrations to enable turning off Impress 2020 would be:
1. **Pet appearances** (`OutfitPetAppearance`, `OutfitPetAppearanceById`)
- Backend: Create `/pet-types/:species/:color/appearances.json` or similar
- Frontend: Update `useOutfitAppearance.js` to use REST loader
2. **Item appearance layers** (`OutfitItemsAppearance`)
- May already be covered by `/items/:id/appearances.json`?
- Need to verify if this is redundant with existing loader
3. **Species/Color/Zone metadata** (`SpeciesColorPicker`, `SearchToolbarZones`)
- Backend: Create endpoints for species, colors, zones listings
- Or embed this static data in the JS bundle
4. **Pose availability** (`PosePicker`)
- Backend: Add pose data to pet type endpoints
- Frontend: Update pose picker to use REST data
### Priority 2: Support Tools
Support tools could be migrated as Rails admin pages using Turbo/Stimulus:
- Pet state labeling (pose picker support)
- Item layer management
- Manual data corrections
These don't need to be React-based; simpler Rails views would work fine.
### Priority 3: Image Services
This is the most complex migration:
- Move headless browser rendering into a Rails service or separate microservice
- Set up image storage (S3 or similar)
- Update outfit image URLs to point to new service
## Deployment Architecture
### Current Setup
- **Main Rails app**: Primary VPS server, serves web traffic and API
- **Impress 2020**: Separate VPS in same datacenter, provides GraphQL API and image services
- **Database**: MySQL on main Rails server, accessed by both services
- **OpenNeo ID database**: Separate MySQL database (legacy, could be merged)
### After Full Migration
- **Single Rails app**: One VPS serving everything
- **Image service**: Either integrated into Rails or extracted as a simple microservice
- **Single MySQL database**: Merge OpenNeo ID schema into main database
## Notes
- The wardrobe-2020 frontend is already embedded in Rails (`app/javascript/wardrobe-2020/`)
- Many API calls have been successfully migrated from GraphQL to REST
- The GraphQL dependency is primarily in the core outfit rendering logic
- Support tools are the lowest priority since they're staff-only
## See Also
- [Customization Architecture](./customization-architecture.md) - Explains the data model
- `app/javascript/wardrobe-2020/loaders/` - Migrated REST API calls
- `config/routes.rb` - Rails API endpoints


@@ -22,9 +22,26 @@ namespace :neopets do
]
namespace :import do
# Gets the neologin cookie, either from ENV['NEOLOGIN_COOKIE'] or by prompting.
# The neologin cookie is required for authenticated Neopets requests (Rainbow Pool,
# Styling Studio). It's generally long-lived (~1 year), so it can be stored in the
# environment and rotated manually when it expires.
#
# To extract the cookie:
# 1. Log into Neopets.com in your browser
# 2. Open browser DevTools > Application/Storage > Cookies
# 3. Find the "neologin" cookie value
# 4. Set NEOLOGIN_COOKIE environment variable to that value
# 5. Update production.env and redeploy when the cookie expires
task :neologin do
unless Neologin.cookie?
Neologin.cookie = STDIN.getpass("Neologin cookie: ")
# Try environment variable first (for automated cron jobs)
if ENV['NEOLOGIN_COOKIE'].present?
Neologin.cookie = ENV['NEOLOGIN_COOKIE']
else
# Fall back to interactive prompt (for local development)
Neologin.cookie = STDIN.getpass("Neologin cookie: ")
end
end
end
end


@@ -1,97 +1,109 @@
namespace "neopets:import" do
desc "Sync our NCMallRecord table with the live NC Mall"
task :nc_mall => :environment do
# Log to STDOUT.
Rails.logger = Logger.new(STDOUT)
begin
# Log to STDOUT.
Rails.logger = Logger.new(STDOUT)
puts "Importing from NC Mall…"
puts "Importing from NC Mall…"
# First, load all records of what's being sold in the live NC Mall. We load
# the homepage and all pages linked from the main document, and extract the
# items from each. (We also de-duplicate the items, which is important
# because the algorithm expects to only process each item once!)
pages = load_all_nc_mall_pages
live_item_records = pages.map { |p| p[:items] }.flatten.uniq
# First, load all records of what's being sold in the live NC Mall. We load
# all categories from the menu and fetch all items from each. (We also
# de-duplicate the items, which is important because the same item can
# appear in multiple categories!)
live_item_records = load_all_nc_mall_items.uniq { |item| item[:id] }

# Then, get the existing NC Mall records in our database. (We include the
# items, to be able to output the item name during logging.)
existing_records = NCMallRecord.includes(:item).all
existing_records_by_item_id = existing_records.to_h { |r| [r.item_id, r] }

# Additionally, check which of the item IDs in the live records are items
# we've seen before. (We'll skip records for items we don't know.)
live_item_ids = live_item_records.map { |r| r[:id] }
recognized_item_ids = Item.where(id: live_item_ids).pluck(:id).to_set
Rails.logger.debug "We found #{live_item_records.size} items, and we " +
  "recognize #{recognized_item_ids.size} of them."

# For each record in the live NC Mall, check if there's an existing record.
# If so, update it, and remove it from the existing records hash. If not,
# create it.
live_item_records.each do |record_data|
  # If we don't recognize this item ID in our database already, skip it.
  next unless recognized_item_ids.include?(record_data[:id])

  record = existing_records_by_item_id.delete(record_data[:id]) ||
    NCMallRecord.new
  record.item_id = record_data[:id]
  record.price = record_data[:price]
  record.discount_price = record_data.dig(:discount, :price)
  record.discount_begins_at = record_data.dig(:discount, :begins_at)
  record.discount_ends_at = record_data.dig(:discount, :ends_at)

  if !record.changed?
    Rails.logger.info "Skipping record for item #{record_data[:name]} " +
      "(unchanged)"
    next
  end

  if record.save
    if record.previously_new_record?
      Rails.logger.info "Created record for item #{record_data[:name]}"
    else
      Rails.logger.info "Updated record for item #{record_data[:name]}"
    end
  else
    Rails.logger.error "Failed to save record for item " +
      "#{record_data[:name]}: " +
      "#{record.errors.full_messages.join("; ")}: " +
      "#{record.inspect}"
  end
end

# For each existing record remaining in the existing records hash, this
# means there was no live record corresponding to it during this sync.
# Delete it!
existing_records_by_item_id.values.each do |record|
  item_name = record.item&.name || "<item not found>"
  if record.destroy
    Rails.logger.info "Destroyed record #{record.id} for item " +
      "#{item_name}"
  else
    Rails.logger.error "Failed to destroy record #{record.id} for " +
      "item #{item_name}: #{record.inspect}"
  end
end
rescue => e
  Rails.logger.error "Failed to import NC Mall data: #{e.message}"
  Rails.logger.error e.backtrace.join("\n")
  Sentry.capture_exception(e, tags: { task: "neopets:import:nc_mall" })
  raise
end
end
end
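The sync strategy in this task (index existing records by item ID, consume matches as you go, then destroy whatever remains) can be sketched in plain Ruby, without ActiveRecord. The `sync_records` helper and its hash shapes are invented for illustration:

```ruby
# A minimal sketch of the task's sync strategy, using plain hashes in place
# of NCMallRecord. The helper name and hash shapes are invented; the real
# task updates ActiveRecord models and logs as it goes.
def sync_records(live_items, existing_by_id)
  remaining = existing_by_id.dup
  created, updated = [], []

  live_items.each do |item|
    if (record = remaining.delete(item[:id]))
      # Matched an existing record: update it in place.
      record[:price] = item[:price]
      updated << record
    else
      # No existing record: create one.
      created << { id: item[:id], price: item[:price] }
    end
  end

  # Whatever is left in `remaining` had no live counterpart, so it's stale.
  [created, updated, remaining.values]
end
```

Deleting matched keys as we go is what lets the leftover hash values double as the list of stale records to destroy, with no second pass.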
def load_all_nc_mall_items
  Sync do
    # Load all categories from the menu JSON
    categories = Neopets::NCMall.load_categories

    # Load all pages for each category, 10 categories at a time
    category_item_tasks = DTIRequests.load_many(max_at_once: 10) do |task|
      categories.map do |category|
        task.async do
          type = category["type"]
          cat_id = category["cat_id"]
          Rails.logger.debug "Loading category: #{category["cat_name"]} " +
            "(type=#{type}, cat=#{cat_id})"
          Neopets::NCMall.load_category_all_pages(type, cat_id)
        end
      end
    end

    # Flatten all items from all categories and return as a single array
    # (We'll de-duplicate in the main task)
    category_item_tasks.map(&:wait).flatten
  end
end


@@ -3,71 +3,81 @@ require "addressable/template"
namespace "neopets:import" do
  desc "Import all basic image hashes from the Rainbow Pool, onto PetTypes"
  task :rainbow_pool => ["neopets:import:neologin", :environment] do
    begin
      puts "Importing from Rainbow Pool…"
      all_species = Species.order(:name).to_a
      all_pet_types = PetType.all.to_a
      all_pet_types_by_species_id_and_color_id = all_pet_types.
        to_h { |pt| [[pt.species_id, pt.color_id], pt] }
      all_colors_by_name = Color.all.to_h { |c| [c.human_name.downcase, c] }

      hashes_by_color_name_by_species_id = {}
      DTIRequests.load_many(max_at_once: 10) do |task|
        num_loaded = 0
        num_total = all_species.size
        print "0/#{num_total} species loaded"

        all_species.each do |species|
          task.async do
            begin
              hashes_by_color_name_by_species_id[species.id] =
                RainbowPool.load_hashes_for_species(species.id, Neologin.cookie)
            rescue => error
              puts "Failed to load #{species.name} page, skipping: #{error.message}"
              Sentry.capture_exception(error,
                tags: { task: "neopets:import:rainbow_pool" },
                contexts: { species: { name: species.name, id: species.id } })
            end
            num_loaded += 1
            print "\r#{num_loaded}/#{num_total} species loaded"
          end
        end
      end

      all_species.each do |species|
        hashes_by_color_name = hashes_by_color_name_by_species_id[species.id]
        next if hashes_by_color_name.nil?

        changed_pet_types = []
        hashes_by_color_name.each do |color_name, image_hash|
          color = all_colors_by_name[color_name.downcase]
          if color.nil?
            puts "Skipping unrecognized color name: #{color_name}"
            next
          end
          pet_type = all_pet_types_by_species_id_and_color_id[
            [species.id, color.id]]
          if pet_type.nil?
            puts "Skipping unrecognized pet type: " +
              "#{color_name} #{species.human_name}"
            next
          end
          if pet_type.basic_image_hash.nil?
            puts "Found new image hash: #{image_hash} (#{pet_type.human_name})"
            pet_type.basic_image_hash = image_hash
            changed_pet_types << pet_type
          elsif pet_type.basic_image_hash != image_hash
            puts "Updating image hash: #{image_hash} (#{pet_type.human_name})"
            pet_type.basic_image_hash = image_hash
            changed_pet_types << pet_type
          else
            # No need to do anything with image hashes that match!
          end
        end
        PetType.transaction { changed_pet_types.each(&:save!) }
        puts "Saved #{changed_pet_types.size} image hashes for " +
          "#{species.human_name}"
      end
    rescue => e
      puts "Failed to import Rainbow Pool data: #{e.message}"
      puts e.backtrace.join("\n")
      Sentry.capture_exception(e, tags: { task: "neopets:import:rainbow_pool" })
      raise
    end
  end
end
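The per-pet-type decision above boils down to a three-way comparison between the stored image hash and the live one. A standalone sketch, with invented color names and hash values:

```ruby
# Classify each live image hash against what's stored: missing locally means
# "new", different means "updated", identical means "unchanged". Color names
# and hash values here are made up for illustration.
def plan_image_hash_changes(stored_hashes, live_hashes)
  changes = { new: [], updated: [], unchanged: [] }
  live_hashes.each do |color_name, image_hash|
    case stored_hashes[color_name]
    when nil        then changes[:new] << color_name
    when image_hash then changes[:unchanged] << color_name
    else                 changes[:updated] << color_name
    end
  end
  changes
end
```

Only the new and updated buckets need saving, which is why the task batches just the changed pet types into one transaction per species.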


@@ -1,99 +1,109 @@
namespace "neopets:import" do
  desc "Import alt style info from the NC Styling Studio"
  task :styling_studio => ["neopets:import:neologin", :environment] do
    begin
      puts "Importing from Styling Studio…"
      all_species = Species.order(:name).to_a

      # Load 10 species pages from the NC Mall at a time.
      styles_by_species_id = {}
      DTIRequests.load_many(max_at_once: 10) do |task|
        num_loaded = 0
        num_total = all_species.size
        print "0/#{num_total} species loaded"

        all_species.each do |species|
          task.async {
            begin
              styles_by_species_id[species.id] = Neopets::NCMall.load_styles(
                species_id: species.id,
                neologin: Neologin.cookie,
              )
            rescue => error
              puts "\n⚠️ Error loading for #{species.human_name}, skipping: #{error.message}"
              Sentry.capture_exception(error,
                tags: { task: "neopets:import:styling_studio" },
                contexts: { species: { name: species.human_name, id: species.id } })
            end
            num_loaded += 1
            print "\r#{num_loaded}/#{num_total} species loaded"
          }
        end
      end
      print "\n"

      style_ids = styles_by_species_id.values.flatten(1).map { |s| s[:oii] }
      style_records_by_id =
        AltStyle.where(id: style_ids).to_h { |as| [as.id, as] }

      all_species.each do |species|
        styles = styles_by_species_id[species.id]
        next if styles.nil?

        counts = {changed: 0, unchanged: 0, skipped: 0}
        styles.each do |style|
          record = style_records_by_id[style[:oii]]
          label = "#{style[:name]} (#{style[:oii]})"
          if record.nil?
            puts "❔ [#{label}]: Not modeled yet, skipping"
            counts[:skipped] += 1
            next
          end

          if !record.real_full_name?
            record.full_name = style[:name]
            puts "✅ [#{label}]: Full name is now #{style[:name].inspect}"
          elsif record.full_name != style[:name]
            puts "⚠️ [#{label}]: Full name may have changed, handle manually? " +
              "#{record.full_name.inspect} -> #{style[:name].inspect}"
          end

          if !record.real_thumbnail_url?
            record.thumbnail_url = style[:image]
            puts "✅ [#{label}]: Thumbnail URL is now #{style[:image].inspect}"
          elsif record.thumbnail_url != style[:image]
            puts "⚠️ [#{label}]: Thumbnail URL may have changed, handle manually? " +
              "#{record.thumbnail_url.inspect} -> #{style[:image].inspect}"
          end

          if style[:name].end_with?(record.pet_name)
            new_series_name = style[:name].split(record.pet_name).first.strip
            if !record.real_series_name?
              record.series_name = new_series_name
              puts "✅ [#{label}]: Series name is now #{new_series_name.inspect}"
            elsif record.series_name != new_series_name
              if ENV['FORCE'] == '1'
                puts "❗ [#{label}]: Series name forcibly changed: " +
                  "#{record.series_name.inspect} -> #{new_series_name.inspect}"
                record.series_name = new_series_name
              else
                puts "⚠️ [#{label}]: Series name may have changed, handle manually? " +
                  "#{record.series_name.inspect} -> #{new_series_name.inspect}"
              end
            end
          else
            puts "⚠️ [#{label}]: Unable to detect series name, handle manually? " +
              "#{record.pet_name.inspect} <-> #{style[:name].inspect}"
          end

          if record.changed?
            counts[:changed] += 1
          else
            counts[:unchanged] += 1
          end
          record.save!
        end
        puts "#{species.human_name}: #{counts[:changed]} changed, " +
          "#{counts[:unchanged]} unchanged, #{counts[:skipped]} skipped"
      end
    rescue => e
      puts "Failed to import Styling Studio data: #{e.message}"
      puts e.backtrace.join("\n")
      Sentry.capture_exception(e, tags: { task: "neopets:import:styling_studio" })
      raise
    end
  end
end
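The series-name heuristic in this task (if the style's display name ends with the pet's name, everything before it is the series) works as a standalone string operation. A sketch, with invented style and pet names:

```ruby
# Mirrors the task's parsing: a style name like "Nostalgic Faerie Draik" with
# pet name "Draik" yields series name "Nostalgic Faerie". Returns nil when
# the style name doesn't end with the pet name, which the task flags for
# manual handling. (The example names are invented.)
def extract_series_name(style_name, pet_name)
  return nil unless style_name.end_with?(pet_name)
  style_name.split(pet_name).first.to_s.strip
end
```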


@@ -551,6 +551,17 @@ RSpec.describe Pet, type: :model do
it("has no thumbnail yet") { expect(alt_style.thumbnail_url?).to be false }
it("is saved when saving the pet") { pet.save!; should be_persisted }
it "logs creation of new alt style" do
expect(Rails.logger).to receive(:info).with(
a_string_matching(/\[Alt Style Modeling\] Created alt style ID=87458 for pet=Majal_Kita/)
.and(matching(/species_id=20/))
.and(matching(/color_id=62/))
.and(matching(/body_id=378/))
.and(matching(/asset_ids=\[56223\]/))
)
pet.alt_style
end
describe "its assets" do
subject(:assets) { alt_style.swf_assets }
let(:asset_ids) { assets.map(&:remote_id) }
@@ -588,6 +599,13 @@ RSpec.describe Pet, type: :model do
new_pet.save!; expect(alt_style.previous_changes).to be_empty
end
it "logs re-modeling without changes" do
expect(Rails.logger).to receive(:info).with(
a_string_matching(/\[Alt Style Modeling\] Loaded alt style ID=87458 for pet=Majal_Kita \(no changes\)/)
)
new_pet.alt_style
end
describe "its assets" do
subject(:assets) { alt_style.swf_assets }
@@ -599,6 +617,37 @@ RSpec.describe Pet, type: :model do
end
end
end
context "when an alt style with the same ID but different attributes already exists" do
before do
Pet.load("Blue_Jetsam").save!
# Create an alt style with ID 87458 but different attributes
# (simulating Neopets reusing an ID)
wrong_color = Color.find_by_name!("Blue")
wrong_species = Species.find_by_name!("Acara")
AltStyle.create!(
id: 87458,
color_id: wrong_color.id,
species_id: wrong_species.id,
body_id: 999,
thumbnail_url: "http://example.com/wrong.png"
)
end
subject(:pet) { Pet.load("Majal_Kita") }
it "logs a warning about updating existing alt style" do
expect(Rails.logger).to receive(:warn).with(
a_string_matching(/\[Alt Style Modeling\] Updated alt style ID=87458 for pet=Majal_Kita/)
.and(matching(/CHANGED:/))
.and(matching(/species_id: 1 -> 20/))
.and(matching(/color_id: 8 -> 62/))
.and(matching(/body_id: 999 -> 378/))
.and(matching(/asset_ids: \[\] -> \[56223\]/))
)
pet.alt_style
end
end
end
end


@@ -3,8 +3,8 @@ require_relative '../rails_helper'
RSpec.describe Neopets::NCMall, type: :model do
describe ".load_page" do
def stub_v2_page_request(page: 1)
stub_request(:get, "https://ncmall.neopets.com/mall/ajax/v2/category/index.phtml?type=new_items&cat=52&page=#{page}&limit=24").
with(
headers: {
"User-Agent": Rails.configuration.user_agent_for_neopets,
@@ -13,12 +13,12 @@ RSpec.describe Neopets::NCMall, type: :model do
end
subject(:page) do
Neopets::NCMall.load_page("new_items", 52, page: 1, limit: 24)
end
it "loads a page from the v2 NC Mall API" do
stub_v2_page_request.to_return(
body: '{"html":"","render_html":"0","type":"new_items","data":[{"id":82936,"name":"+1 Extra Pet Slot","description":"Just ONE more Neopet... just ONE more...! This pack includes 1 extra pet slot. Each extra pet slot can be used to create a new pet, adopt a pet, or bring back any idle pets lost from non-premium accounts.","price":500,"discountPrice":0,"atPurchaseDiscountPrice":null,"discountBegin":1735372800,"discountEnd":1735718399,"uses":1,"isSuperpack":0,"isBundle":0,"packContents":null,"isAvailable":1,"imageFile":"mall_petslots_1","saleBegin":1703094300,"saleEnd":0,"duration":0,"isSoldOut":0,"isNeohome":0,"isWearable":0,"isBuyable":1,"isAlbumTheme":0,"isGiftbox":0,"isInRandomWindow":null,"isElite":0,"isCollectible":0,"isKeyquest":0,"categories":null,"isHabitarium":0,"isNoInvInsert":1,"isLimitedQuantity":0,"isPresale":0,"isGambling":0,"petSlotPack":1,"maxPetSlots":10,"currentUserBoughtPetSlots":0,"formatted":{"name":"+1 Extra Pet Slot","ck":false,"price":"500","discountPrice":"0","limited":false},"converted":true},{"id":90226,"name":"Weekend Sales 2025 Mega Gram","description":"Lets go shopping! Purchase this Weekend Sales Mega Gram and choose from exclusive Weekend Sales items to send to a Neofriend, no gift box needed! This gram also has a chance of including a Limited Edition NC item. 
Please visit the NC Mall FAQs for more information on this item.","price":250,"discountPrice":125,"atPurchaseDiscountPrice":null,"discountBegin":1737136800,"discountEnd":1737446399,"uses":1,"isSuperpack":0,"isBundle":0,"packContents":null,"isAvailable":1,"imageFile":"42embjc204","saleBegin":1737136800,"saleEnd":1739865599,"duration":0,"isSoldOut":0,"isNeohome":0,"isWearable":0,"isBuyable":1,"isAlbumTheme":0,"isGiftbox":0,"isInRandomWindow":null,"isElite":0,"isCollectible":0,"isKeyquest":0,"categories":null,"isHabitarium":0,"isNoInvInsert":0,"isLimitedQuantity":0,"isPresale":0,"isGambling":0,"formatted":{"name":"Weekend Sales 2025 Mega Gram","ck":false,"price":"250","discountPrice":"125","limited":false},"converted":true}],"totalItems":"2","totalPages":"1","page":"1","limit":"24"}'
)
expect(page[:items]).to contain_exactly(
@@ -45,6 +45,65 @@ RSpec.describe Neopets::NCMall, type: :model do
is_available: true,
},
)
expect(page[:total_pages]).to eq(1)
expect(page[:page]).to eq(1)
end
it "handles pagination metadata" do
stub_v2_page_request.to_return(
body: '{"html":"","render_html":"0","type":"new_items","data":[{"id":82936,"name":"Test Item","description":"Test","price":100,"discountPrice":0,"atPurchaseDiscountPrice":null,"discountBegin":1735372800,"discountEnd":1735718399,"uses":1,"isSuperpack":0,"isBundle":0,"packContents":null,"isAvailable":1,"imageFile":"test","saleBegin":1703094300,"saleEnd":0,"duration":0,"isSoldOut":0,"isNeohome":0,"isWearable":1,"isBuyable":1,"isAlbumTheme":0,"isGiftbox":0,"isInRandomWindow":null,"isElite":0,"isCollectible":0,"isKeyquest":0,"categories":null,"isHabitarium":0,"isNoInvInsert":0,"isLimitedQuantity":0,"isPresale":0,"isGambling":0,"formatted":{"name":"Test Item","ck":false,"price":"100","discountPrice":"0","limited":false},"converted":true}],"totalItems":"50","totalPages":"3","page":"1","limit":"24"}'
)
expect(page[:total_pages]).to eq(3)
expect(page[:page]).to eq(1)
expect(page[:limit]).to eq(24)
end
end
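Given the `page`/`total_pages`/`limit` metadata these tests pin down, an all-pages loader can be shaped as: fetch page 1, read the total, then fetch the rest. This is a hypothetical sketch, not the actual `load_category_all_pages` implementation; `fetch_page` stands in for the HTTP call:

```ruby
# Fetch page 1 first to learn :total_pages, then walk the remaining pages
# and concatenate their items. `fetch_page` is any callable that returns a
# hash like { total_pages: 3, items: [...] }. (Names are illustrative.)
def load_all_pages(fetch_page)
  first = fetch_page.call(1)
  rest = (2..first[:total_pages]).map { |n| fetch_page.call(n) }
  ([first] + rest).flat_map { |page| page[:items] }
end
```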
describe ".load_categories" do
def stub_homepage_request
stub_request(:get, "https://ncmall.neopets.com/mall/shop.phtml").
with(
headers: {
"User-Agent": Rails.configuration.user_agent_for_neopets,
},
)
end
subject(:categories) do
Neopets::NCMall.load_categories
end
it "extracts browsable categories from menu JSON and maps load types" do
stub_homepage_request.to_return(
body: '<html><head><script>window.ncmall_menu = [{"cat_id":52,"cat_name":"New","load_type":"new"},{"cat_id":54,"cat_name":"Popular","load_type":"popular"},{"cat_id":42,"cat_name":"Customization","load_type":"neopet","children":[{"cat_id":43,"cat_name":"Clothing","parent_id":42},{"cat_id":44,"cat_name":"Shoes","parent_id":42}]},{"cat_name":"Specialty","children":[{"cat_id":85,"cat_name":"NC Collectible","load_type":"collectible","url":"https://www.neopets.com/mall/nc_collectible_case.phtml"},{"cat_id":13,"cat_name":"Elite Boutique","url":"https://ncmall.neopets.com/mall/shop.phtml?page=&cat=13"}]}];</script></head></html>'
)
expect(categories).to contain_exactly(
hash_including("cat_id" => 52, "cat_name" => "New", "type" => "new_items"),
hash_including("cat_id" => 54, "cat_name" => "Popular", "type" => "popular_items"),
hash_including("cat_id" => 42, "cat_name" => "Customization", "type" => "browse"),
hash_including("cat_id" => 43, "cat_name" => "Clothing", "parent_id" => 42, "type" => "browse"),
hash_including("cat_id" => 44, "cat_name" => "Shoes", "parent_id" => 42, "type" => "browse"),
)
# Should NOT include load_type field (it's been converted to type)
categories.each do |cat|
expect(cat).not_to have_key("load_type")
end
# Should NOT include categories with external URLs
expect(categories).not_to include(
hash_including("cat_name" => "NC Collectible"),
)
expect(categories).not_to include(
hash_including("cat_name" => "Elite Boutique"),
)
# Should NOT include structural parent without cat_id
expect(categories).not_to include(
hash_including("cat_name" => "Specialty"),
)
end
end
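The filtering this spec exercises (flatten children, keep only entries with a `cat_id` and no external `url`, and convert `load_type` into a `type`) can be sketched as a plain data transform. This is an illustration of the expected behavior, not the actual `Neopets::NCMall.load_categories` implementation:

```ruby
# Known type mappings from the spec; anything else (or a missing load_type)
# falls back to the generic "browse" type.
TYPE_MAP = { "new" => "new_items", "popular" => "popular_items" }

def browsable_categories(menu)
  menu.flat_map { |cat| [cat] + (cat["children"] || []) }
      .select { |cat| cat["cat_id"] && !cat["url"] }
      .map do |cat|
        type = TYPE_MAP.fetch(cat["load_type"], "browse")
        cat.reject { |k, _| ["load_type", "children"].include?(k) }
           .merge("type" => type)
      end
end
```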

Binary file not shown.

vendor/cache/ffi-1.17.2-arm64-darwin.gem vendored Normal file

Binary file not shown.

Binary file not shown.

Binary file not shown.

vendor/gems/README-RocketAMF.md vendored Normal file

@@ -0,0 +1,75 @@
# RocketAMF Vendored Gem
_Fix and docs authored by Claude Code, with Matchu's supervision. I'm not super familiar with C extensions in Ruby, but the edit seems small and safe, and pet loading still works as expected!_
This directory contains a vendored, patched version of RocketAMF 1.0.0.
## Why Vendored?
RocketAMF is a critical dependency for DTI's "modeling" system - it enables communication with Neopets.com's legacy Flash/AMF (Action Message Format) API to fetch pet appearance data. However, the upstream gem has not been maintained for modern Ruby versions.
## What Was Changed?
**File modified**: `ext/rocketamf_ext/class_mapping.c`
**Problem**: Ruby 3.4 introduced stricter type checking for `st_foreach` callback functions. The original code used incorrect function pointer types that were accepted in older Ruby versions but rejected in Ruby 3.4+.
**Fix**: Updated the `mapping_populate_iter` callback function signature (line 340) to match Ruby 3.4's requirements:
```c
// BEFORE (Ruby < 3.4):
static int mapping_populate_iter(VALUE key, VALUE val, const VALUE args[2])
// AFTER (Ruby 3.4+):
static int mapping_populate_iter(st_data_t key_data, st_data_t val_data, st_data_t args_data) {
VALUE key = (VALUE)key_data;
VALUE val = (VALUE)val_data;
const VALUE *args = (const VALUE *)args_data;
// ... rest of function unchanged
}
```
The function body remains identical - we just cast the `st_data_t` parameters to the expected `VALUE` types at the start of the function.
## Upstream Status
**Repository**: https://github.com/rubyamf/rocketamf
**Last commit**: 2018
**Issue**: No active maintenance; Ruby 3.4 compatibility not addressed upstream
We chose to vendor this fix rather than maintain a full fork because:
1. RocketAMF functionality is stable - we don't expect to need updates
2. The community demand is low (Neopets' Flash API is legacy)
3. A vendored gem is simpler to maintain than a hosted fork
## Testing
After applying the fix, verified:
- ✅ Gem compiles successfully on ARM (aarch64-linux) with Ruby 3.4.5
- ✅ Gem loads without errors: `require 'rocketamf'`
- ✅ C extension works: `RocketAMF::ClassMapping.new`
- ✅ End-to-end Neopets API integration functional
## Updating This Gem
If you need to update RocketAMF in the future:
1. Clone the upstream repo: `git clone https://github.com/rubyamf/rocketamf.git`
2. Apply the fix to `ext/rocketamf_ext/class_mapping.c` (see above)
3. Build the gem: `gem build RocketAMF.gemspec`
4. Unpack to vendor: `gem unpack RocketAMF-X.X.X.gem`
5. Update the Gemfile path if version changed
6. Test thoroughly with `bundle install` and Neopets modeling functionality
## Alternative Solutions Considered
- **Fork to GitHub**: Too much maintenance overhead for a single file change
- **Downgrade Ruby**: Would miss out on Ruby 3.4+ features and security updates
- **Pure-Ruby AMF library**: None exist with active maintenance
- **Patch at runtime**: C extension issues can't be patched from Ruby
---
**Fix applied**: 2025-10-30
**Ruby version**: 3.4.5
**Architecture**: ARM64 (aarch64-linux)

vendor/gems/RocketAMF-1.0.0/README.rdoc vendored Normal file

@@ -0,0 +1,47 @@
== DESCRIPTION:
RocketAMF is a full featured AMF0/3 serializer and deserializer with support for
bi-directional Flash to Ruby class mapping, custom serialization and mapping,
remoting gateway helpers that follow AMF0/3 messaging specs, and a suite of specs
to ensure adherence to the specification documents put out by Adobe. If the C
components compile, then RocketAMF automatically takes advantage of them to
provide a substantial performance benefit. In addition, RocketAMF is fully
compatible with Ruby 1.9.
== INSTALL:
gem install RocketAMF
== SIMPLE EXAMPLE:
require 'rocketamf'
hash = {:apple => "Apfel", :red => "Rot", :eyes => "Augen"}
File.open("amf.dat", 'w') do |f|
f.write RocketAMF.serialize(hash, 3) # Use AMF3 encoding to serialize
end
== LICENSE:
(The MIT License)
Copyright (c) 2011 Stephen Augenstein and Jacob Henry
Permission is hereby granted, free of charge, to any person obtaining
a copy of this software and associated documentation files (the
'Software'), to deal in the Software without restriction, including
without limitation the rights to use, copy, modify, merge, publish,
distribute, sublicense, and/or sell copies of the Software, and to
permit persons to whom the Software is furnished to do so, subject to
the following conditions:
The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

vendor/gems/RocketAMF-1.0.0/Rakefile vendored Normal file

@@ -0,0 +1,59 @@
require 'rubygems'
require 'rake'
require 'rake/rdoctask'
require 'rake/gempackagetask'
require 'rspec/core/rake_task'
require 'rake/extensiontask'
desc 'Default: run the specs.'
task :default => :spec
# I don't want to depend on bundler, so we do it the bundler way without it
gemspec_path = 'RocketAMF.gemspec'
spec = begin
eval(File.read(File.join(File.dirname(__FILE__), gemspec_path)), TOPLEVEL_BINDING, gemspec_path)
rescue LoadError => e
original_line = e.backtrace.find { |line| line.include?(gemspec_path) }
msg = "There was a LoadError while evaluating #{gemspec_path}:\n #{e.message}"
msg << " from\n #{original_line}" if original_line
msg << "\n"
puts msg
exit
end
RSpec::Core::RakeTask.new do |t|
end
desc 'Generate documentation'
Rake::RDocTask.new(:rdoc) do |rdoc|
rdoc.rdoc_dir = 'rdoc'
rdoc.title = spec.name
rdoc.options += spec.rdoc_options
rdoc.rdoc_files.include(*spec.extra_rdoc_files)
rdoc.rdoc_files.include("lib") # Don't include ext folder because no one cares
end
Rake::GemPackageTask.new(spec) do |pkg|
pkg.need_zip = false
pkg.need_tar = false
end
Rake::ExtensionTask.new('rocketamf_ext', spec) do |ext|
if RUBY_PLATFORM =~ /mswin|mingw/ then
# No cross-compile on win, so compile extension to lib/1.[89]
RUBY_VERSION =~ /(\d+\.\d+)/
ext.lib_dir = "lib/#{$1}"
else
ext.cross_compile = true
ext.cross_platform = 'x86-mingw32'
ext.cross_compiling do |gem_spec|
gem_spec.post_install_message = "You installed the binary version of this gem!"
end
end
#ext.config_options << '--enable-sort-props'
end
desc "Build gem packages"
task :gems do
sh "rake cross native gem RUBY_CC_VERSION=1.8.7:1.9.2"
end


@ -0,0 +1,20 @@
# -*- encoding: utf-8 -*-
Gem::Specification.new do |s|
s.name = 'RocketAMF'
s.version = '1.0.0.dti1'
s.platform = Gem::Platform::RUBY
s.authors = ['Jacob Henry', 'Stephen Augenstein', "Joc O'Connor"]
s.email = ['perl.programmer@gmail.com']
s.homepage = 'http://github.com/rubyamf/rocketamf'
s.summary = 'Fast AMF serializer/deserializer with remoting request/response wrappers to simplify integration'
s.files = Dir[*['README.rdoc', 'benchmark.rb', 'RocketAMF.gemspec', 'Rakefile', 'lib/**/*.rb', 'spec/**/*.{rb,bin,opts}', 'ext/**/*.{c,h,rb}']]
s.test_files = Dir[*['spec/**/*_spec.rb']]
s.extensions = Dir[*["ext/**/extconf.rb"]]
s.require_paths = ["lib"]
s.has_rdoc = true
s.extra_rdoc_files = ['README.rdoc']
s.rdoc_options = ['--line-numbers', '--main', 'README.rdoc']
end


@ -0,0 +1,74 @@
$:.unshift(File.dirname(__FILE__) + '/ext')
$:.unshift(File.dirname(__FILE__) + '/lib')
require 'rubygems'
require 'rocketamf'
require 'rocketamf/pure/deserializer' # Only ext gets included by default if available
require 'rocketamf/pure/serializer'
OBJECT_COUNT = 100000
TESTS = 5
class TestClass
attr_accessor :prop_a, :prop_b, :prop_c, :prop_d, :prop_e
def populate some_arg=nil # Make sure class mapper doesn't think populate is a property
@@count ||= 1
@prop_a = "asdfasdf #{@@count}"
@prop_b = "simple string"
@prop_c = 3120094.03
@prop_d = Time.now
@prop_e = 3120094
@@count += 1
self
end
end
objs = []
OBJECT_COUNT.times do
objs << TestClass.new.populate
end
["native", "pure"].each do |type|
# Set up class mapper
cm = if type == "pure"
RocketAMF::ClassMapping
else
RocketAMF::Ext::FastClassMapping
end
cm.define do |m|
m.map :as => 'TestClass', :ruby => 'TestClass'
end
[0, 3].each do |version|
# 2**24 is larger than anyone is ever going to run this for
min_serialize = 2**24
min_deserialize = 2**24
puts "Testing #{type} AMF#{version}:"
TESTS.times do
ser = if type == "pure"
RocketAMF::Pure::Serializer.new(cm.new)
else
RocketAMF::Ext::Serializer.new(cm.new)
end
start_time = Time.now
out = ser.serialize(version, objs)
end_time = Time.now
puts "\tserialize run: #{end_time-start_time}s"
min_serialize = [end_time-start_time, min_serialize].min
des = if type == "pure"
RocketAMF::Pure::Deserializer.new(cm.new)
else
RocketAMF::Ext::Deserializer.new(cm.new)
end
start_time = Time.now
temp = des.deserialize(version, out)
end_time = Time.now
puts "\tdeserialize run: #{end_time-start_time}s"
min_deserialize = [end_time-start_time, min_deserialize].min
end
puts "\tminimum serialize time: #{min_serialize}s"
puts "\tminimum deserialize time: #{min_deserialize}s"
end
end


@ -0,0 +1,487 @@
#include <ruby.h>
#ifdef HAVE_RB_STR_ENCODE
#include <ruby/st.h>
#else
#include <st.h>
#endif
#include "utility.h"
extern VALUE mRocketAMF;
extern VALUE mRocketAMFExt;
VALUE cFastMappingSet;
VALUE cTypedHash;
ID id_use_ac;
ID id_use_ac_ivar;
ID id_mappings;
ID id_mappings_ivar;
ID id_hashset;
typedef struct {
VALUE mapset;
st_table* setter_cache;
st_table* prop_cache;
} CLASS_MAPPING;
typedef struct {
st_table* as_mappings;
st_table* rb_mappings;
} MAPSET;
/*
* Mark the as_mappings and rb_mappings hashes
*/
static void mapset_mark(MAPSET *set) {
if(!set) return;
rb_mark_tbl(set->as_mappings);
rb_mark_tbl(set->rb_mappings);
}
/*
* Free the mapping tables and struct
*/
int mapset_free_strtable_key(st_data_t key, st_data_t value, st_data_t ignored) {
xfree((void *)key);
return ST_DELETE;
}
static void mapset_free(MAPSET *set) {
st_foreach(set->as_mappings, mapset_free_strtable_key, 0);
st_free_table(set->as_mappings);
set->as_mappings = NULL;
st_foreach(set->rb_mappings, mapset_free_strtable_key, 0);
st_free_table(set->rb_mappings);
set->rb_mappings = NULL;
xfree(set);
}
/*
* Allocate mapset and populate mappings with built-in mappings
*/
static VALUE mapset_alloc(VALUE klass) {
MAPSET *set = ALLOC(MAPSET);
memset(set, 0, sizeof(MAPSET));
VALUE self = Data_Wrap_Struct(klass, mapset_mark, mapset_free, set);
// Initialize internal data
set->as_mappings = st_init_strtable();
set->rb_mappings = st_init_strtable();
return self;
}
/*
* call-seq:
* RocketAMF::Ext::MappingSet.new
*
* Creates a mapping set object and populates the default mappings
*/
static VALUE mapset_init(VALUE self) {
rb_funcall(self, rb_intern("map_defaults"), 0);
return self;
}
/*
* call-seq:
* m.map_defaults
*
* Adds required mapping configs, calling map for the required base mappings
*/
static VALUE mapset_map_defaults(VALUE self) {
const int NUM_MAPPINGS = 9;
const char* ruby_classes[] = {
"RocketAMF::Values::AbstractMessage",
"RocketAMF::Values::RemotingMessage",
"RocketAMF::Values::AsyncMessage",
"RocketAMF::Values::AsyncMessageExt",
"RocketAMF::Values::CommandMessage",
"RocketAMF::Values::CommandMessageExt",
"RocketAMF::Values::AcknowledgeMessage",
"RocketAMF::Values::AcknowledgeMessageExt",
"RocketAMF::Values::ErrorMessage"
};
const char* as_classes[] = {
"flex.messaging.messages.AbstractMessage",
"flex.messaging.messages.RemotingMessage",
"flex.messaging.messages.AsyncMessage",
"DSA",
"flex.messaging.messages.CommandMessage",
"DSC",
"flex.messaging.messages.AcknowledgeMessage",
"DSK",
"flex.messaging.messages.ErrorMessage"
};
int i;
ID map_id = rb_intern("map");
VALUE params = rb_hash_new();
VALUE as_sym = ID2SYM(rb_intern("as"));
VALUE ruby_sym = ID2SYM(rb_intern("ruby"));
for(i = 0; i < NUM_MAPPINGS; i++) {
rb_hash_aset(params, as_sym, rb_str_new2(as_classes[i]));
rb_hash_aset(params, ruby_sym, rb_str_new2(ruby_classes[i]));
rb_funcall(self, map_id, 1, params);
}
return self;
}
/*
* call-seq:
* m.map :as => 'com.example.Date', :ruby => 'Example::Date'
*
* Map a given AS class to a ruby class. Use fully qualified names for both.
*/
static VALUE mapset_map(VALUE self, VALUE mapping) {
MAPSET *set;
Data_Get_Struct(self, MAPSET, set);
VALUE as_class = rb_hash_aref(mapping, ID2SYM(rb_intern("as")));
VALUE rb_class = rb_hash_aref(mapping, ID2SYM(rb_intern("ruby")));
st_insert(set->as_mappings, (st_data_t)strdup(RSTRING_PTR(as_class)), rb_class);
st_insert(set->rb_mappings, (st_data_t)strdup(RSTRING_PTR(rb_class)), as_class);
return Qnil;
}
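The two `st_table`s above keep AS-to-ruby and ruby-to-AS lookups each O(1). A minimal pure-Ruby sketch of the same idea (`SimpleMappingSet` is a hypothetical illustration, not part of the gem):

```ruby
# Hypothetical pure-Ruby equivalent of the MAPSET struct: two string-keyed
# hashes kept in sync so lookups work in either direction.
class SimpleMappingSet
  def initialize
    @as_mappings = {} # AS class name => ruby class name
    @rb_mappings = {} # ruby class name => AS class name
  end

  def map(mapping)
    as_class = mapping[:as]
    rb_class = mapping[:ruby]
    @as_mappings[as_class] = rb_class
    @rb_mappings[rb_class] = as_class
  end

  def get_as_class_name(rb_class)
    @rb_mappings[rb_class]
  end

  def get_ruby_class_name(as_class)
    @as_mappings[as_class]
  end
end
```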
/*
* Internal method for looking up a given ruby class's AS class name or Qnil if
* not found
*/
static VALUE mapset_as_lookup(VALUE self, const char* class_name) {
MAPSET *set;
Data_Get_Struct(self, MAPSET, set);
VALUE as_name;
if(st_lookup(set->rb_mappings, (st_data_t)class_name, &as_name)) {
return as_name;
} else {
return Qnil;
}
}
/*
* Internal method for looking up a given AS class name's ruby class name
* mapping, or Qnil if not found
*/
static VALUE mapset_rb_lookup(VALUE self, const char* class_name) {
MAPSET *set;
Data_Get_Struct(self, MAPSET, set);
VALUE rb_name;
if(st_lookup(set->as_mappings, (st_data_t)class_name, &rb_name)) {
return rb_name;
} else {
return Qnil;
}
}
/*
* Mark the mapset object and property lookup cache
*/
static void mapping_mark(CLASS_MAPPING *map) {
if(!map) return;
rb_gc_mark(map->mapset);
rb_mark_tbl(map->prop_cache);
}
/*
* Free prop cache table and struct
*/
static void mapping_free(CLASS_MAPPING *map) {
st_free_table(map->setter_cache);
st_free_table(map->prop_cache);
xfree(map);
}
/*
* Allocate class mapping struct
*/
static VALUE mapping_alloc(VALUE klass) {
CLASS_MAPPING *map = ALLOC(CLASS_MAPPING);
memset(map, 0, sizeof(CLASS_MAPPING));
VALUE self = Data_Wrap_Struct(klass, mapping_mark, mapping_free, map);
map->setter_cache = st_init_numtable();
map->prop_cache = st_init_numtable();
return self;
}
/*
* Class-level getter for use_array_collection
*/
static VALUE mapping_s_array_collection_get(VALUE klass) {
VALUE use_ac = rb_ivar_get(klass, id_use_ac_ivar);
if(use_ac == Qnil) {
use_ac = Qfalse;
rb_ivar_set(klass, id_use_ac_ivar, use_ac);
}
return use_ac;
}
/*
* Class-level setter for use_array_collection
*/
static VALUE mapping_s_array_collection_set(VALUE klass, VALUE use_ac) {
return rb_ivar_set(klass, id_use_ac_ivar, use_ac);
}
/*
* Return MappingSet for class mapper, creating if uninitialized
*/
static VALUE mapping_s_mappings(VALUE klass) {
VALUE mappings = rb_ivar_get(klass, id_mappings_ivar);
if(mappings == Qnil) {
mappings = rb_class_new_instance(0, NULL, cFastMappingSet);
rb_ivar_set(klass, id_mappings_ivar, mappings);
}
return mappings;
}
/*
* call-seq:
* mapper.define {|m| block } => nil
*
* Define class mappings in the block. Block is passed a MappingSet object as
* the first parameter. See RocketAMF::ClassMapping for details.
*/
static VALUE mapping_s_define(VALUE klass) {
if (rb_block_given_p()) {
VALUE mappings = rb_funcall(klass, id_mappings, 0);
rb_yield(mappings);
}
return Qnil;
}
/*
* Reset class mappings
*/
static VALUE mapping_s_reset(VALUE klass) {
rb_ivar_set(klass, id_use_ac_ivar, Qfalse);
rb_ivar_set(klass, id_mappings_ivar, Qnil);
return Qnil;
}
/*
* Initialize class mapping object, setting use_class_mapping to false
*/
static VALUE mapping_init(VALUE self) {
CLASS_MAPPING *map;
Data_Get_Struct(self, CLASS_MAPPING, map);
map->mapset = rb_funcall(CLASS_OF(self), id_mappings, 0);
VALUE use_ac = rb_funcall(CLASS_OF(self), id_use_ac, 0);
rb_ivar_set(self, id_use_ac_ivar, use_ac);
return self;
}
/*
* call-seq:
* mapper.get_as_class_name => str
*
* Returns the AS class name for the given ruby object. Will also take a string
* containing the ruby class name.
*/
static VALUE mapping_as_class_name(VALUE self, VALUE obj) {
CLASS_MAPPING *map;
Data_Get_Struct(self, CLASS_MAPPING, map);
int type = TYPE(obj);
const char* class_name;
if(type == T_STRING) {
// Use strings as the class name
class_name = RSTRING_PTR(obj);
} else {
// Look up the class name and use that
VALUE klass = CLASS_OF(obj);
class_name = rb_class2name(klass);
if(klass == cTypedHash) {
VALUE orig_name = rb_funcall(obj, rb_intern("type"), 0);
class_name = RSTRING_PTR(orig_name);
} else if(type == T_HASH) {
// Don't bother looking up hash mapping, but need to check class name first in case it's a typed hash
return Qnil;
}
}
return mapset_as_lookup(map->mapset, class_name);
}
/*
* call-seq:
* mapper.get_ruby_obj => obj
*
* Instantiates a ruby object using the mapping configuration based on the
* source AS class name. If there is no mapping defined, it returns a
* <tt>RocketAMF::Values::TypedHash</tt> with the serialized class name.
*/
static VALUE mapping_get_ruby_obj(VALUE self, VALUE name) {
CLASS_MAPPING *map;
Data_Get_Struct(self, CLASS_MAPPING, map);
VALUE argv[1];
VALUE ruby_class_name = mapset_rb_lookup(map->mapset, RSTRING_PTR(name));
if(ruby_class_name == Qnil) {
argv[0] = name;
return rb_class_new_instance(1, argv, cTypedHash);
} else {
VALUE base_const = rb_mKernel;
char* endptr;
char* ptr = RSTRING_PTR(ruby_class_name);
while((endptr = strstr(ptr,"::"))) {
endptr[0] = '\0'; // NULL terminate to make string ops work
base_const = rb_const_get(base_const, rb_intern(ptr));
endptr[0] = ':'; // Restore correct char
ptr = endptr + 2;
}
return rb_class_new_instance(0, NULL, rb_const_get(base_const, rb_intern(ptr)));
}
}
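The `::`-splitting loop above walks the constant namespace one segment at a time. A hypothetical Ruby equivalent (starting from `Object` rather than the C code's `rb_mKernel`, which resolves the same top-level constants):

```ruby
# Resolve a fully qualified ruby class name like "Foo::Bar::Baz" by walking
# each namespace segment with const_get.
def resolve_ruby_class(name)
  name.split('::').inject(Object) { |namespace, part| namespace.const_get(part) }
end
```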
/*
* st_table iterator for populating a given object from a property hash
*/
static int mapping_populate_iter(st_data_t key_data, st_data_t val_data, st_data_t args_data) {
VALUE key = (VALUE)key_data;
VALUE val = (VALUE)val_data;
const VALUE *args = (const VALUE *)args_data;
CLASS_MAPPING *map;
Data_Get_Struct(args[0], CLASS_MAPPING, map);
VALUE obj = args[1];
if(TYPE(obj) == T_HASH) {
rb_hash_aset(obj, key, val);
return ST_CONTINUE;
}
if(TYPE(key) != T_SYMBOL) rb_raise(rb_eArgError, "Invalid type for property key: %d", TYPE(key));
// Calculate symbol for setter function
ID key_id = SYM2ID(key);
ID setter_id;
if(!st_lookup(map->setter_cache, key_id, &setter_id)) {
// Calculate symbol
const char* key_str = rb_id2name(key_id);
long len = strlen(key_str);
char* setter = ALLOC_N(char, len+2);
memcpy(setter, key_str, len);
setter[len] = '=';
setter[len+1] = '\0';
setter_id = rb_intern(setter);
xfree(setter);
// Store it
st_add_direct(map->setter_cache, key_id, setter_id);
}
if(rb_respond_to(obj, setter_id)) {
rb_funcall(obj, setter_id, 1, val);
} else if(rb_respond_to(obj, id_hashset)) {
rb_funcall(obj, id_hashset, 2, key, val);
}
return ST_CONTINUE;
}
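The iterator's fallback chain can be sketched in plain Ruby: hashes get direct assignment, objects prefer a `key=` setter, and `[]=` is the last resort (`populate_props` is a hypothetical helper, and it skips the C version's symbol-to-setter cache):

```ruby
# Populate an object from a property hash: Hash targets take keys directly,
# otherwise try a "key=" setter, then fall back to []= if available.
def populate_props(obj, props)
  props.each do |key, val|
    if obj.is_a?(Hash)
      obj[key] = val
    elsif obj.respond_to?("#{key}=")
      obj.public_send("#{key}=", val)
    elsif obj.respond_to?(:[]=)
      obj[key] = val
    end
  end
  obj
end
```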
/*
* call-seq:
* mapper.populate_ruby_obj(obj, props, dynamic_props=nil) => obj
*
* Populates the ruby object using the given properties. Property hashes MUST
* have symbol keys, or it will raise an exception.
*/
static VALUE mapping_populate(int argc, VALUE *argv, VALUE self) {
// Check args
VALUE obj, props, dynamic_props;
rb_scan_args(argc, argv, "21", &obj, &props, &dynamic_props);
VALUE args[2] = {self, obj};
st_foreach(RHASH_TBL(props), mapping_populate_iter, (st_data_t)args);
if(dynamic_props != Qnil) {
st_foreach(RHASH_TBL(dynamic_props), mapping_populate_iter, (st_data_t)args);
}
return obj;
}
/*
* call-seq:
* mapper.props_for_serialization(obj) => hash
*
* Extracts all exportable properties from the given ruby object and returns
* them in a hash. For performance purposes, property detection is only performed
* once for a given class instance, and then cached for all instances of that
* class. IF YOU'RE ADDING AND REMOVING PROPERTIES FROM CLASS INSTANCES YOU
* CANNOT USE THE FAST CLASS MAPPER.
*/
static VALUE mapping_props(VALUE self, VALUE obj) {
CLASS_MAPPING *map;
Data_Get_Struct(self, CLASS_MAPPING, map);
if(TYPE(obj) == T_HASH) {
return obj;
}
// Get "properties"
VALUE props_ary;
VALUE klass = CLASS_OF(obj);
long i, len;
if(!st_lookup(map->prop_cache, klass, &props_ary)) {
props_ary = rb_ary_new();
// Build props array
VALUE all_methods = rb_class_public_instance_methods(0, NULL, klass);
VALUE object_methods = rb_class_public_instance_methods(0, NULL, rb_cObject);
VALUE possible_methods = rb_funcall(all_methods, rb_intern("-"), 1, object_methods);
len = RARRAY_LEN(possible_methods);
for(i = 0; i < len; i++) {
VALUE meth = rb_obj_method(obj, RARRAY_PTR(possible_methods)[i]);
VALUE arity = rb_funcall(meth, rb_intern("arity"), 0);
if(FIX2INT(arity) == 0) {
rb_ary_push(props_ary, RARRAY_PTR(possible_methods)[i]);
}
}
// Store it
st_add_direct(map->prop_cache, klass, props_ary);
}
// Build properties hash using list of properties
VALUE props = rb_hash_new();
len = RARRAY_LEN(props_ary);
for(i = 0; i < len; i++) {
VALUE key = RARRAY_PTR(props_ary)[i];
ID getter = (TYPE(key) == T_STRING) ? rb_intern(RSTRING_PTR(key)) : SYM2ID(key);
rb_hash_aset(props, key, rb_funcall(obj, getter, 0));
}
return props;
}
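The detection rule above treats every public zero-arity method not inherited from Object as a readable property. A hypothetical Ruby sketch of that rule, minus the per-class cache:

```ruby
# Collect "properties": public zero-arity methods that aren't defined on
# Object, returned as a name => value hash.
def props_for_serialization(obj)
  candidates = obj.class.public_instance_methods - Object.public_instance_methods
  getters = candidates.select { |m| obj.method(m).arity == 0 }
  getters.each_with_object({}) { |m, props| props[m] = obj.public_send(m) }
end
```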
void Init_rocket_amf_fast_class_mapping() {
// Define map set
cFastMappingSet = rb_define_class_under(mRocketAMFExt, "FastMappingSet", rb_cObject);
rb_define_alloc_func(cFastMappingSet, mapset_alloc);
rb_define_method(cFastMappingSet, "initialize", mapset_init, 0);
rb_define_method(cFastMappingSet, "map_defaults", mapset_map_defaults, 0);
rb_define_method(cFastMappingSet, "map", mapset_map, 1);
// Define FastClassMapping
VALUE cFastClassMapping = rb_define_class_under(mRocketAMFExt, "FastClassMapping", rb_cObject);
rb_define_alloc_func(cFastClassMapping, mapping_alloc);
rb_define_singleton_method(cFastClassMapping, "use_array_collection", mapping_s_array_collection_get, 0);
rb_define_singleton_method(cFastClassMapping, "use_array_collection=", mapping_s_array_collection_set, 1);
rb_define_singleton_method(cFastClassMapping, "mappings", mapping_s_mappings, 0);
rb_define_singleton_method(cFastClassMapping, "reset", mapping_s_reset, 0);
rb_define_singleton_method(cFastClassMapping, "define", mapping_s_define, 0);
rb_define_attr(cFastClassMapping, "use_array_collection", 1, 0);
rb_define_method(cFastClassMapping, "initialize", mapping_init, 0);
rb_define_method(cFastClassMapping, "get_as_class_name", mapping_as_class_name, 1);
rb_define_method(cFastClassMapping, "get_ruby_obj", mapping_get_ruby_obj, 1);
rb_define_method(cFastClassMapping, "populate_ruby_obj", mapping_populate, -1);
rb_define_method(cFastClassMapping, "props_for_serialization", mapping_props, 1);
// Cache values
cTypedHash = rb_const_get(rb_const_get(mRocketAMF, rb_intern("Values")), rb_intern("TypedHash"));
id_use_ac = rb_intern("use_array_collection");
id_use_ac_ivar = rb_intern("@use_array_collection");
id_mappings = rb_intern("mappings");
id_mappings_ivar = rb_intern("@mappings");
id_hashset = rb_intern("[]=");
}


@ -0,0 +1,52 @@
// AMF0 Type Markers
#define AMF0_NUMBER_MARKER 0x00
#define AMF0_BOOLEAN_MARKER 0x01
#define AMF0_STRING_MARKER 0x02
#define AMF0_OBJECT_MARKER 0x03
#define AMF0_MOVIE_CLIP_MARKER 0x04
#define AMF0_NULL_MARKER 0x05
#define AMF0_UNDEFINED_MARKER 0x06
#define AMF0_REFERENCE_MARKER 0x07
#define AMF0_HASH_MARKER 0x08
#define AMF0_OBJECT_END_MARKER 0x09
#define AMF0_STRICT_ARRAY_MARKER 0x0A
#define AMF0_DATE_MARKER 0x0B
#define AMF0_LONG_STRING_MARKER 0x0C
#define AMF0_UNSUPPORTED_MARKER 0x0D
#define AMF0_RECORDSET_MARKER 0x0E
#define AMF0_XML_MARKER 0x0F
#define AMF0_TYPED_OBJECT_MARKER 0x10
#define AMF0_AMF3_MARKER 0x11
// AMF3 Type Markers
#define AMF3_UNDEFINED_MARKER 0x00
#define AMF3_NULL_MARKER 0x01
#define AMF3_FALSE_MARKER 0x02
#define AMF3_TRUE_MARKER 0x03
#define AMF3_INTEGER_MARKER 0x04
#define AMF3_DOUBLE_MARKER 0x05
#define AMF3_STRING_MARKER 0x06
#define AMF3_XML_DOC_MARKER 0x07
#define AMF3_DATE_MARKER 0x08
#define AMF3_ARRAY_MARKER 0x09
#define AMF3_OBJECT_MARKER 0x0A
#define AMF3_XML_MARKER 0x0B
#define AMF3_BYTE_ARRAY_MARKER 0x0C
#define AMF3_VECTOR_INT_MARKER 0x0D
#define AMF3_VECTOR_UINT_MARKER 0x0E
#define AMF3_VECTOR_DOUBLE_MARKER 0x0F
#define AMF3_VECTOR_OBJECT_MARKER 0x10
#define AMF3_DICT_MARKER 0x11
// Other AMF3 Markers
#define AMF3_EMPTY_STRING 0x01
#define AMF3_DYNAMIC_OBJECT 0x0B
#define AMF3_CLOSE_DYNAMIC_OBJECT 0x01
#define AMF3_CLOSE_DYNAMIC_ARRAY 0x01
// Other Constants
#define MAX_INTEGER 268435455
#define MIN_INTEGER -268435456
#define INITIAL_STREAM_LENGTH 128 // Initial buffer length for serializer output
#define MAX_STREAM_LENGTH 10*1024*1024 // Let's cap it at 10MB for now
#define MAX_ARRAY_PREALLOC 100000
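As a concrete example of how these markers are used on the wire: an AMF0 value is its one-byte type marker followed by the payload, so a number is the `0x00` marker plus an IEEE-754 double in network byte order. A hypothetical Ruby encoder (`write_amf0_number` is illustrative, not from the gem):

```ruby
AMF0_NUMBER_MARKER = 0x00

# Encode one AMF0 number: marker byte, then the big-endian ("network") double.
def write_amf0_number(n)
  [AMF0_NUMBER_MARKER].pack('C') + [n].pack('G')
end
```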


@ -0,0 +1,776 @@
#include "deserializer.h"
#include "constants.h"
#define DES_BOUNDS_CHECK(des, i) if(des->pos + (i) > des->size || des->pos + (i) < des->pos) rb_raise(rb_eRangeError, "reading %lu bytes is beyond end of source: %ld (pos), %ld (size)", (unsigned long)(i), des->pos, des->size);
extern VALUE mRocketAMF;
extern VALUE mRocketAMFExt;
extern VALUE cDeserializer;
extern VALUE cStringIO;
extern VALUE sym_class_name;
extern VALUE sym_members;
extern VALUE sym_externalizable;
extern VALUE sym_dynamic;
ID id_get_ruby_obj;
ID id_populate_ruby_obj;
static VALUE des0_deserialize(VALUE self, char type);
static VALUE des3_deserialize(VALUE self);
char des_read_byte(AMF_DESERIALIZER *des) {
DES_BOUNDS_CHECK(des, 1);
des->pos++;
return des->stream[des->pos-1];
}
char des_read_ahead_byte(AMF_DESERIALIZER *des) {
DES_BOUNDS_CHECK(des, 1);
return des->stream[des->pos];
}
int des_read_uint16(AMF_DESERIALIZER *des) {
DES_BOUNDS_CHECK(des, 2);
const unsigned char *str = (unsigned char*)(des->stream) + des->pos;
des->pos += 2;
return ((str[0] << 8) | str[1]);
}
unsigned int des_read_uint32(AMF_DESERIALIZER *des) {
DES_BOUNDS_CHECK(des, 4);
const unsigned char *str = (unsigned char*)(des->stream) + des->pos;
des->pos += 4;
return ((str[0] << 24) | (str[1] << 16) | (str[2] << 8) | str[3]);
}
/*
* Read a network double
*/
double des_read_double(AMF_DESERIALIZER *des) {
DES_BOUNDS_CHECK(des, 8);
union aligned {
double dval;
char cval[8];
} d;
const char *str = des->stream + des->pos;
des->pos += 8;
#ifdef WORDS_BIGENDIAN
memcpy(d.cval, str, 8);
#else
d.cval[0] = str[7];
d.cval[1] = str[6];
d.cval[2] = str[5];
d.cval[3] = str[4];
d.cval[4] = str[3];
d.cval[5] = str[2];
d.cval[6] = str[1];
d.cval[7] = str[0];
#endif
return d.dval;
}
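In Ruby the same byte-order handling is built into `String#unpack` via the `'G'` directive (network-order double), so a hypothetical equivalent of the swap above is a one-liner:

```ruby
# Decode a big-endian IEEE-754 double from an array of 8 bytes, as Ruby's
# pure deserializer effectively does with unpack('G').
def read_network_double(bytes)
  bytes.pack('C*').unpack1('G')
end
```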
/*
* Read an AMF3 style integer
*/
int des_read_int(AMF_DESERIALIZER *des) {
int result = 0, byte_cnt = 0;
DES_BOUNDS_CHECK(des, 1);
unsigned char byte = des->stream[des->pos++];
while(byte & 0x80 && byte_cnt < 3) {
result <<= 7;
result |= byte & 0x7f;
DES_BOUNDS_CHECK(des, 1);
byte = des->stream[des->pos++];
byte_cnt++;
}
if (byte_cnt < 3) {
result <<= 7;
result |= byte & 0x7F;
} else {
result <<= 8;
result |= byte & 0xff;
}
if (result & 0x10000000) {
result -= 0x20000000;
}
return result;
}
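A hypothetical pure-Ruby port of the loop above shows the AMF3 29-bit variable-length integer format: the high bit of each of the first three bytes signals continuation (7 payload bits each), a fourth byte contributes all 8 bits, and values with bit 28 set are sign-extended:

```ruby
# Decode an AMF3 U29 integer from an array of bytes.
def read_amf3_int(bytes)
  result = 0
  byte_cnt = 0
  pos = 0
  byte = bytes[pos]; pos += 1
  # Up to three continuation bytes carry 7 bits each
  while byte & 0x80 != 0 && byte_cnt < 3
    result = (result << 7) | (byte & 0x7f)
    byte = bytes[pos]; pos += 1
    byte_cnt += 1
  end
  if byte_cnt < 3
    result = (result << 7) | (byte & 0x7f)
  else
    # The fourth byte contributes all 8 bits
    result = (result << 8) | (byte & 0xff)
  end
  # Sign-extend 29-bit negatives
  result -= 0x20000000 if result & 0x10000000 != 0
  result
end
```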
/*
* Read a string and then force the encoding to UTF-8 if running ruby 1.9
*/
VALUE des_read_string(AMF_DESERIALIZER *des, unsigned int len) {
DES_BOUNDS_CHECK(des, len);
VALUE str = rb_str_new(des->stream + des->pos, len);
#ifdef HAVE_RB_STR_ENCODE
rb_encoding *utf8 = rb_utf8_encoding();
rb_enc_associate(str, utf8);
ENC_CODERANGE_CLEAR(str);
#endif
des->pos += len;
return str;
}
/*
* Set the source of the amf reader to a StringIO object, creating a new one to
* wrap the source if it's only a string
*/
void des_set_src(AMF_DESERIALIZER *des, VALUE src) {
VALUE klass = CLASS_OF(src);
if(klass == cStringIO) {
VALUE str = rb_funcall(src, rb_intern("string"), 0);
des->src = src;
des->stream = RSTRING_PTR(str);
des->pos = NUM2LONG(rb_funcall(src, rb_intern("pos"), 0));
des->size = RSTRING_LEN(str);
} else if(klass == rb_cString) {
VALUE args[1] = {src};
des->src = rb_class_new_instance(1, args, cStringIO);
des->stream = RSTRING_PTR(src);
des->pos = 0;
des->size = RSTRING_LEN(src);
} else {
rb_raise(rb_eArgError, "Invalid source type to deserialize from");
}
if(des->pos >= des->size) rb_raise(rb_eRangeError, "already at the end of the source");
}
/*
* Create AMF3 deserializer and copy source data over to it, before calling
* AMF3 internal deserialize function
*/
static VALUE des0_read_amf3(VALUE self) {
AMF_DESERIALIZER *des;
Data_Get_Struct(self, AMF_DESERIALIZER, des);
des->version = 3;
des->str_cache = rb_ary_new();
des->trait_cache = rb_ary_new();
return des3_deserialize(self);
}
/*
* Reads an AMF0 hash
*/
static void des0_read_props(VALUE self, VALUE hash) {
AMF_DESERIALIZER *des;
Data_Get_Struct(self, AMF_DESERIALIZER, des);
while(1) {
int len = des_read_uint16(des);
if(len == 0 && des_read_ahead_byte(des) == AMF0_OBJECT_END_MARKER) {
// Don't create a ruby string if this is really the object end
des_read_byte(des); // Read type byte
return;
} else {
VALUE key = des_read_string(des, len);
char type = des_read_byte(des);
rb_hash_aset(hash, key, des0_deserialize(self, type));
}
}
}
static VALUE des0_read_object(VALUE self) {
AMF_DESERIALIZER *des;
Data_Get_Struct(self, AMF_DESERIALIZER, des);
// Create object and add to cache
VALUE obj = rb_funcall(des->class_mapper, id_get_ruby_obj, 1, rb_str_new(NULL, 0));
rb_ary_push(des->obj_cache, obj);
// Populate object
VALUE props = rb_hash_new();
des0_read_props(self, props);
rb_funcall(des->class_mapper, id_populate_ruby_obj, 2, obj, props);
return obj;
}
static VALUE des0_read_typed_object(VALUE self) {
AMF_DESERIALIZER *des;
Data_Get_Struct(self, AMF_DESERIALIZER, des);
// Create object and add to cache
VALUE class_name = des_read_string(des, des_read_uint16(des));
VALUE obj = rb_funcall(des->class_mapper, id_get_ruby_obj, 1, class_name);
rb_ary_push(des->obj_cache, obj);
// Populate object
VALUE props = rb_hash_new();
des0_read_props(self, props);
rb_funcall(des->class_mapper, id_populate_ruby_obj, 2, obj, props);
return obj;
}
static VALUE des0_read_hash(VALUE self) {
AMF_DESERIALIZER *des;
Data_Get_Struct(self, AMF_DESERIALIZER, des);
des_read_uint32(des); // Hash size, but there's no optimization I can perform with this
VALUE obj = rb_hash_new();
rb_ary_push(des->obj_cache, obj);
des0_read_props(self, obj);
return obj;
}
static VALUE des0_read_array(VALUE self) {
AMF_DESERIALIZER *des;
Data_Get_Struct(self, AMF_DESERIALIZER, des);
// Limit size of pre-allocation to force remote user to actually send data,
// rather than just sending a size of 2**32-1 and nothing afterwards to
// crash the server
unsigned int len = des_read_uint32(des);
VALUE ary = rb_ary_new2(len < MAX_ARRAY_PREALLOC ? len : MAX_ARRAY_PREALLOC);
rb_ary_push(des->obj_cache, ary);
unsigned int i;
for(i = 0; i < len; i++) {
rb_ary_push(ary, des0_deserialize(self, des_read_byte(des)));
}
return ary;
}
static VALUE des0_read_time(VALUE self) {
AMF_DESERIALIZER *des;
Data_Get_Struct(self, AMF_DESERIALIZER, des);
double milli = des_read_double(des);
des_read_uint16(des); // Timezone - unused
time_t sec = milli/1000.0;
time_t micro = (milli-sec*1000)*1000;
return rb_time_new(sec, micro);
}
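AMF0 dates arrive as milliseconds since the Unix epoch encoded as a double. A hypothetical Ruby sketch of the seconds/microseconds split used above:

```ruby
# Convert an AMF0 date payload (milliseconds as a float) into a Time,
# splitting whole seconds from the fractional remainder in microseconds.
def amf0_millis_to_time(milli)
  sec = (milli / 1000.0).to_i
  micro = ((milli - sec * 1000) * 1000).to_i
  Time.at(sec, micro)
end
```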
/*
* Internal C deserialize call. Takes deserializer and a char for the type
* marker.
*/
static VALUE des0_deserialize(VALUE self, char type) {
AMF_DESERIALIZER *des;
Data_Get_Struct(self, AMF_DESERIALIZER, des);
long tmp;
VALUE ret = Qnil;
switch(type) {
case AMF0_STRING_MARKER:
ret = des_read_string(des, des_read_uint16(des));
break;
case AMF0_AMF3_MARKER:
ret = des0_read_amf3(self);
break;
case AMF0_NUMBER_MARKER:
ret = rb_float_new(des_read_double(des));
break;
case AMF0_BOOLEAN_MARKER:
ret = des_read_byte(des) == 0 ? Qfalse : Qtrue;
break;
case AMF0_NULL_MARKER:
case AMF0_UNDEFINED_MARKER:
case AMF0_UNSUPPORTED_MARKER:
ret = Qnil;
break;
case AMF0_OBJECT_MARKER:
ret = des0_read_object(self);
break;
case AMF0_TYPED_OBJECT_MARKER:
ret = des0_read_typed_object(self);
break;
case AMF0_HASH_MARKER:
ret = des0_read_hash(self);
break;
case AMF0_STRICT_ARRAY_MARKER:
ret = des0_read_array(self);
break;
case AMF0_REFERENCE_MARKER:
tmp = des_read_uint16(des);
if(tmp >= RARRAY_LEN(des->obj_cache)) rb_raise(rb_eRangeError, "reference index beyond end");
ret = RARRAY_PTR(des->obj_cache)[tmp];
break;
case AMF0_DATE_MARKER:
ret = des0_read_time(self);
break;
case AMF0_XML_MARKER:
case AMF0_LONG_STRING_MARKER:
ret = des_read_string(des, des_read_uint32(des));
break;
default:
rb_raise(rb_eRuntimeError, "Not supported: %d", type);
break;
}
return ret;
}
static VALUE des3_read_string(AMF_DESERIALIZER *des) {
int header = des_read_int(des);
if((header & 1) == 0) {
header >>= 1;
if(header >= RARRAY_LEN(des->str_cache)) rb_raise(rb_eRangeError, "str reference index beyond end");
return RARRAY_PTR(des->str_cache)[header];
} else {
VALUE str = des_read_string(des, header >> 1);
if(RSTRING_LEN(str) > 0) rb_ary_push(des->str_cache, str);
return str;
}
}
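This function is the first of several that follow the same AMF3 header convention: the low bit of the decoded integer selects between a cache reference and an inline value, and the remaining bits carry the index or length. A hypothetical helper that makes the split explicit:

```ruby
# Split an AMF3 header into its kind (reference vs inline) and its
# payload (cache index or inline length).
def parse_amf3_header(header)
  (header & 1).zero? ? [:reference, header >> 1] : [:inline, header >> 1]
end
```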
/*
* Same as des3_read_string, but XML uses the object cache, rather than the
* string cache
*/
static VALUE des3_read_xml(VALUE self) {
AMF_DESERIALIZER *des;
Data_Get_Struct(self, AMF_DESERIALIZER, des);
int header = des_read_int(des);
if((header & 1) == 0) {
header >>= 1;
if(header >= RARRAY_LEN(des->obj_cache)) rb_raise(rb_eRangeError, "obj reference index beyond end");
return RARRAY_PTR(des->obj_cache)[header];
} else {
VALUE str = des_read_string(des, header >> 1);
if(RSTRING_LEN(str) > 0) rb_ary_push(des->obj_cache, str);
return str;
}
}
static VALUE des3_read_object(VALUE self) {
AMF_DESERIALIZER *des;
Data_Get_Struct(self, AMF_DESERIALIZER, des);
int header = des_read_int(des);
if((header & 1) == 0) {
header >>= 1;
if(header >= RARRAY_LEN(des->obj_cache)) rb_raise(rb_eRangeError, "obj reference index beyond end");
return RARRAY_PTR(des->obj_cache)[header];
} else {
VALUE externalizable, dynamic, members, class_name, traits;
long i, members_len;
// Parse traits
header >>= 1;
if((header & 1) == 0) {
header >>= 1;
if(header >= RARRAY_LEN(des->trait_cache)) rb_raise(rb_eRangeError, "trait reference index beyond end");
traits = RARRAY_PTR(des->trait_cache)[header];
externalizable = rb_hash_aref(traits, sym_externalizable);
dynamic = rb_hash_aref(traits, sym_dynamic);
members = rb_hash_aref(traits, sym_members);
members_len = members == Qnil ? 0 : RARRAY_LEN(members);
class_name = rb_hash_aref(traits, sym_class_name);
} else {
externalizable = (header & 2) != 0 ? Qtrue : Qfalse;
dynamic = (header & 4) != 0 ? Qtrue : Qfalse;
members_len = header >> 3;
class_name = des3_read_string(des);
members = rb_ary_new2(members_len);
for(i = 0; i < members_len; i++) rb_ary_push(members, des3_read_string(des));
traits = rb_hash_new();
rb_hash_aset(traits, sym_externalizable, externalizable);
rb_hash_aset(traits, sym_dynamic, dynamic);
rb_hash_aset(traits, sym_members, members);
rb_hash_aset(traits, sym_class_name, class_name);
rb_ary_push(des->trait_cache, traits);
}
// Optimization for deserializing ArrayCollection
if(strcmp(RSTRING_PTR(class_name), "flex.messaging.io.ArrayCollection") == 0) {
VALUE arr = des3_deserialize(self); // Adds ArrayCollection array to object cache automatically
rb_ary_push(des->obj_cache, arr); // Add again for ArrayCollection source array
return arr;
}
VALUE obj = rb_funcall(des->class_mapper, id_get_ruby_obj, 1, class_name);
rb_ary_push(des->obj_cache, obj);
if(externalizable == Qtrue) {
rb_funcall(des->src, rb_intern("pos="), 1, LONG2NUM(des->pos)); // Update source StringIO pos
rb_funcall(obj, rb_intern("read_external"), 1, self);
des->pos = NUM2LONG(rb_funcall(des->src, rb_intern("pos"), 0)); // Update from source
return obj;
}
VALUE props = rb_hash_new();
for(i = 0; i < members_len; i++) {
rb_hash_aset(props, RARRAY_PTR(members)[i], des3_deserialize(self));
}
VALUE dynamic_props = Qnil;
if(dynamic == Qtrue) {
dynamic_props = rb_hash_new();
while(1) {
VALUE key = des3_read_string(des);
if(RSTRING_LEN(key) == 0) break;
rb_hash_aset(dynamic_props, key, des3_deserialize(self));
}
}
rb_funcall(des->class_mapper, id_populate_ruby_obj, 3, obj, props, dynamic_props);
return obj;
}
}
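The nested branching above unpacks a U29O-traits header bit by bit. A hypothetical Ruby decoder for the same layout (object-reference bit, then trait-reference bit, then externalizable/dynamic flags and the sealed member count):

```ruby
# Decode an AMF3 object header into its trait information, mirroring the
# shift-and-mask sequence in des3_read_object.
def parse_amf3_traits(raw)
  return { :object_ref => raw >> 1 } if (raw & 1).zero?
  h = raw >> 1
  return { :trait_ref => h >> 1 } if (h & 1).zero?
  {
    :externalizable => (h & 2) != 0,
    :dynamic => (h & 4) != 0,
    :member_count => h >> 3
  }
end
```

Note that `0x0B` (the `AMF3_DYNAMIC_OBJECT` constant from constants.h) decodes to an inline, dynamic, anonymous object with zero sealed members.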
static VALUE des3_read_array(VALUE self) {
AMF_DESERIALIZER *des;
Data_Get_Struct(self, AMF_DESERIALIZER, des);
int i;
int header = des_read_int(des);
if((header & 1) == 0) {
header >>= 1;
if(header >= RARRAY_LEN(des->obj_cache)) rb_raise(rb_eRangeError, "obj reference index beyond end");
return RARRAY_PTR(des->obj_cache)[header];
} else {
header >>= 1;
VALUE obj;
VALUE key = des3_read_string(des);
if(key == Qnil) rb_raise(rb_eRangeError, "key is Qnil");
if(RSTRING_LEN(key) != 0) {
obj = rb_hash_new();
rb_ary_push(des->obj_cache, obj);
while(RSTRING_LEN(key) != 0) {
rb_hash_aset(obj, key, des3_deserialize(self));
key = des3_read_string(des);
}
for(i = 0; i < header; i++) {
rb_hash_aset(obj, INT2FIX(i), des3_deserialize(self));
}
} else {
// Limit size of pre-allocation to force remote user to actually send data,
// rather than just sending a size of 2**32-1 and nothing afterwards to
// crash the server
obj = rb_ary_new2(header < MAX_ARRAY_PREALLOC ? header : MAX_ARRAY_PREALLOC);
rb_ary_push(des->obj_cache, obj);
for(i = 0; i < header; i++) {
rb_ary_push(obj, des3_deserialize(self));
}
}
return obj;
}
}
static VALUE des3_read_time(VALUE self) {
AMF_DESERIALIZER *des;
Data_Get_Struct(self, AMF_DESERIALIZER, des);
int header = des_read_int(des);
if((header & 1) == 0) {
header >>= 1;
if(header >= RARRAY_LEN(des->obj_cache)) rb_raise(rb_eRangeError, "obj reference index beyond end");
return RARRAY_PTR(des->obj_cache)[header];
} else {
double milli = des_read_double(des);
time_t sec = milli/1000.0;
time_t micro = (milli-sec*1000)*1000;
VALUE time = rb_time_new(sec, micro);
rb_ary_push(des->obj_cache, time);
return time;
}
}
static VALUE des3_read_byte_array(VALUE self) {
AMF_DESERIALIZER *des;
Data_Get_Struct(self, AMF_DESERIALIZER, des);
int header = des_read_int(des);
if((header & 1) == 0) {
header >>= 1;
if(header >= RARRAY_LEN(des->obj_cache)) rb_raise(rb_eRangeError, "obj reference index beyond end");
return RARRAY_PTR(des->obj_cache)[header];
} else {
header >>= 1;
VALUE args[1] = {des_read_string(des, header)};
#ifdef HAVE_RB_STR_ENCODE
// Need to force encoding to ASCII-8BIT
rb_encoding *ascii = rb_ascii8bit_encoding();
rb_enc_associate(args[0], ascii);
ENC_CODERANGE_CLEAR(args[0]);
#endif
VALUE ba = rb_class_new_instance(1, args, cStringIO);
rb_ary_push(des->obj_cache, ba);
return ba;
}
}
static VALUE des3_read_dict(VALUE self) {
AMF_DESERIALIZER *des;
Data_Get_Struct(self, AMF_DESERIALIZER, des);
int header = des_read_int(des);
if((header & 1) == 0) {
header >>= 1;
if(header >= RARRAY_LEN(des->obj_cache)) rb_raise(rb_eRangeError, "obj reference index beyond end");
return RARRAY_PTR(des->obj_cache)[header];
} else {
header >>= 1;
VALUE dict = rb_hash_new();
rb_ary_push(des->obj_cache, dict);
des_read_byte(des); // Weak Keys: Not supported in ruby
int i;
for(i = 0; i < header; i++) {
VALUE key = des3_deserialize(self);
VALUE val = des3_deserialize(self);
rb_hash_aset(dict, key, val);
}
return dict;
}
}
static VALUE des3_read_vec(VALUE self, char type) {
AMF_DESERIALIZER *des;
Data_Get_Struct(self, AMF_DESERIALIZER, des);
int header = des_read_int(des);
if((header & 1) == 0) {
header >>= 1;
if(header >= RARRAY_LEN(des->obj_cache)) rb_raise(rb_eRangeError, "obj reference index beyond end");
return RARRAY_PTR(des->obj_cache)[header];
} else {
header >>= 1;
// Limit size of pre-allocation to force remote user to actually send data,
// rather than just sending a size of 2**32-1 and nothing afterwards to
// crash the server
VALUE vec = rb_ary_new2(header < MAX_ARRAY_PREALLOC ? header : MAX_ARRAY_PREALLOC);
rb_ary_push(des->obj_cache, vec);
des_read_byte(des); // Fixed Length: Not supported in ruby
// On 32-bit ARCH, FIXNUM has a limit of 2**31-1, resulting in truncation of large ints/uints
int i;
switch(type) {
case AMF3_VECTOR_INT_MARKER:
for(i = 0; i < header; i++) {
int ival = des_read_uint32(des);
rb_ary_push(vec, INT2FIX(ival));
}
break;
case AMF3_VECTOR_UINT_MARKER:
for(i = 0; i < header; i++) {
rb_ary_push(vec, INT2FIX(des_read_uint32(des)));
}
break;
case AMF3_VECTOR_DOUBLE_MARKER:
for(i = 0; i < header; i++) {
rb_ary_push(vec, rb_float_new(des_read_double(des)));
}
break;
case AMF3_VECTOR_OBJECT_MARKER:
des3_read_string(des); // Class name of objects - ignored
for(i = 0; i < header; i++) {
rb_ary_push(vec, des3_deserialize(self));
}
break;
}
return vec;
}
}
/*
 * Internal deserialize call - unlike des0_deserialize, it reads the type
 * marker itself; minor changes in the AMF3 spec make passing the type in
 * unnecessary.
 */
static VALUE des3_deserialize(VALUE self) {
AMF_DESERIALIZER *des;
Data_Get_Struct(self, AMF_DESERIALIZER, des);
char type = des_read_byte(des);
VALUE ret = Qnil;
switch(type) {
case AMF3_UNDEFINED_MARKER:
case AMF3_NULL_MARKER:
ret = Qnil;
break;
case AMF3_FALSE_MARKER:
ret = Qfalse;
break;
case AMF3_TRUE_MARKER:
ret = Qtrue;
break;
case AMF3_INTEGER_MARKER:
ret = INT2FIX(des_read_int(des));
break;
case AMF3_DOUBLE_MARKER:
ret = rb_float_new(des_read_double(des));
break;
case AMF3_STRING_MARKER:
ret = des3_read_string(des);
break;
case AMF3_ARRAY_MARKER:
ret = des3_read_array(self);
break;
case AMF3_OBJECT_MARKER:
ret = des3_read_object(self);
break;
case AMF3_DATE_MARKER:
ret = des3_read_time(self);
break;
case AMF3_XML_DOC_MARKER:
case AMF3_XML_MARKER:
ret = des3_read_xml(self);
break;
case AMF3_BYTE_ARRAY_MARKER:
ret = des3_read_byte_array(self);
break;
case AMF3_VECTOR_INT_MARKER:
case AMF3_VECTOR_UINT_MARKER:
case AMF3_VECTOR_DOUBLE_MARKER:
case AMF3_VECTOR_OBJECT_MARKER:
ret = des3_read_vec(self, type);
break;
case AMF3_DICT_MARKER:
ret = des3_read_dict(self);
break;
default:
rb_raise(rb_eRuntimeError, "Not supported: %d", type);
break;
}
return ret;
}
/*
* Mark the reader and its source. If caches are populated mark them as well.
*/
static void des_mark(AMF_DESERIALIZER *des) {
if(!des) return;
rb_gc_mark(des->class_mapper);
rb_gc_mark(des->src);
if(des->obj_cache) rb_gc_mark(des->obj_cache);
if(des->str_cache) rb_gc_mark(des->str_cache);
if(des->trait_cache) rb_gc_mark(des->trait_cache);
}
/*
* Free the reader. Only the struct itself needs freeing: the stream pointer
* belongs to the Ruby source object, so nothing else was allocated here.
*/
static void des_free(AMF_DESERIALIZER *des) {
xfree(des);
}
/*
* Create new struct and wrap with class
*/
static VALUE des_alloc(VALUE klass) {
AMF_DESERIALIZER *des = ALLOC(AMF_DESERIALIZER);
memset(des, 0, sizeof(AMF_DESERIALIZER));
return Data_Wrap_Struct(klass, des_mark, des_free, des);
}
/*
* Initializer
*/
static VALUE des_initialize(VALUE self, VALUE class_mapper) {
AMF_DESERIALIZER *des;
Data_Get_Struct(self, AMF_DESERIALIZER, des);
des->class_mapper = class_mapper;
return self;
}
/*
* call-seq:
* des.source => StringIO
*
* Returns the source that the deserializer is reading from
*/
static VALUE des_source(VALUE self) {
AMF_DESERIALIZER *des;
Data_Get_Struct(self, AMF_DESERIALIZER, des);
return des->src;
}
/*
* call-seq:
* des.deserialize(amf_ver, str) => obj
* des.deserialize(amf_ver, StringIO) => obj
*
* Deserialize the string or StringIO from AMF to a ruby object.
*/
VALUE des_deserialize(VALUE self, VALUE ver, VALUE src) {
AMF_DESERIALIZER *des;
Data_Get_Struct(self, AMF_DESERIALIZER, des);
// Process version
int int_ver = FIX2INT(ver);
if(int_ver != 0 && int_ver != 3) rb_raise(rb_eArgError, "unsupported version %d", int_ver);
des->version = int_ver;
// Process source
if(src != Qnil) {
des_set_src(des, src);
} else if(!des->src) {
rb_raise(rb_eArgError, "Missing deserialization source");
}
// Deserialize from source
VALUE ret;
if(des->version == 0) {
des->obj_cache = rb_ary_new();
ret = des0_deserialize(self, des_read_byte(des));
} else {
des->obj_cache = rb_ary_new();
des->str_cache = rb_ary_new();
des->trait_cache = rb_ary_new();
ret = des3_deserialize(self);
}
// Update source position
rb_funcall(des->src, rb_intern("pos="), 1, LONG2NUM(des->pos)); // Update source StringIO pos
return ret;
}
/*
* call-seq:
* des.read_object => obj
*
* Reads an object from the deserializer's stream and returns it.
*/
VALUE des_read_object(VALUE self) {
AMF_DESERIALIZER *des;
Data_Get_Struct(self, AMF_DESERIALIZER, des);
// Update internal pos from source in case they've modified it
des->pos = NUM2LONG(rb_funcall(des->src, rb_intern("pos"), 0));
// Deserialize
VALUE ret;
if(des->version == 0) {
ret = des0_deserialize(self, des_read_byte(des));
} else {
ret = des3_deserialize(self);
}
// Update source position
rb_funcall(des->src, rb_intern("pos="), 1, LONG2NUM(des->pos)); // Update source StringIO pos
return ret;
}
void Init_rocket_amf_deserializer() {
// Define Deserializer
cDeserializer = rb_define_class_under(mRocketAMFExt, "Deserializer", rb_cObject);
rb_define_alloc_func(cDeserializer, des_alloc);
rb_define_method(cDeserializer, "initialize", des_initialize, 1);
rb_define_method(cDeserializer, "source", des_source, 0);
rb_define_method(cDeserializer, "deserialize", des_deserialize, 2);
rb_define_method(cDeserializer, "read_object", des_read_object, 0);
// Get refs to commonly used symbols and ids
id_get_ruby_obj = rb_intern("get_ruby_obj");
id_populate_ruby_obj = rb_intern("populate_ruby_obj");
}


@@ -0,0 +1,28 @@
#include <ruby.h>
#ifdef HAVE_RB_STR_ENCODE
#include <ruby/encoding.h>
#endif
typedef struct {
int version;
VALUE class_mapper;
VALUE src;
char* stream;
unsigned long pos;
unsigned long size;
VALUE obj_cache;
VALUE str_cache;
VALUE trait_cache;
} AMF_DESERIALIZER;
char des_read_byte(AMF_DESERIALIZER *des);
char des_read_ahead_byte(AMF_DESERIALIZER *des);
int des_read_uint16(AMF_DESERIALIZER *des);
unsigned int des_read_uint32(AMF_DESERIALIZER *des);
double des_read_double(AMF_DESERIALIZER *des);
int des_read_int(AMF_DESERIALIZER *des);
VALUE des_read_string(AMF_DESERIALIZER *des, unsigned int len);
VALUE des_read_sym(AMF_DESERIALIZER *des, unsigned int len);
void des_set_src(AMF_DESERIALIZER *des, VALUE src);
VALUE des_deserialize(VALUE self, VALUE ver, VALUE src);


@@ -0,0 +1,18 @@
require 'mkmf'
# Disable the native extension by creating an empty Makefile on JRuby
if defined? JRUBY_VERSION
message "Generating phony Makefile for JRuby so the gem installs"
mfile = File.join(File.dirname(__FILE__), 'Makefile')
File.open(mfile, 'w') {|f| f.write dummy_makefile(File.dirname(__FILE__)) }
exit 0
end
if enable_config("sort-props", false)
$defs.push("-DSORT_PROPS") unless $defs.include? "-DSORT_PROPS"
end
have_func('rb_str_encode')
$CFLAGS += " -Wall"
create_makefile('rocketamf_ext')


@@ -0,0 +1,184 @@
#include "deserializer.h"
#include "serializer.h"
#include "constants.h"
extern VALUE mRocketAMF;
extern VALUE mRocketAMFExt;
extern VALUE cDeserializer;
extern VALUE cSerializer;
VALUE cRocketAMFHeader;
VALUE cRocketAMFMessage;
VALUE cRocketAMFAbstractMessage;
ID id_amf_version;
ID id_headers;
ID id_messages;
ID id_data;
/*
* call-seq:
* env.populate_from_stream(stream, class_mapper=nil)
*
* Included into RocketAMF::Envelope, this method handles deserializing an AMF
* request/response into the envelope
*/
static VALUE env_populate_from_stream(int argc, VALUE *argv, VALUE self) {
static VALUE cClassMapper = 0;
if(cClassMapper == 0) cClassMapper = rb_const_get(mRocketAMF, rb_intern("ClassMapper"));
// Parse args
VALUE src;
VALUE class_mapper;
rb_scan_args(argc, argv, "11", &src, &class_mapper);
if(class_mapper == Qnil) class_mapper = rb_class_new_instance(0, NULL, cClassMapper);
// Create AMF0 deserializer
VALUE args[3];
args[0] = class_mapper;
VALUE des_rb = rb_class_new_instance(1, args, cDeserializer);
AMF_DESERIALIZER *des;
Data_Get_Struct(des_rb, AMF_DESERIALIZER, des);
des_set_src(des, src);
// Read amf version
int amf_ver = des_read_uint16(des);
// Read headers
VALUE headers = rb_hash_new();
int header_cnt = des_read_uint16(des);
int i;
for(i = 0; i < header_cnt; i++) {
VALUE name = des_read_string(des, des_read_uint16(des));
VALUE must_understand = des_read_byte(des) != 0 ? Qtrue : Qfalse;
des_read_uint32(des); // Length is ignored
VALUE data = des_deserialize(des_rb, INT2FIX(0), Qnil);
args[0] = name;
args[1] = must_understand;
args[2] = data;
rb_hash_aset(headers, name, rb_class_new_instance(3, args, cRocketAMFHeader));
}
// Read messages
VALUE messages = rb_ary_new();
int message_cnt = des_read_uint16(des);
for(i = 0; i < message_cnt; i++) {
VALUE target_uri = des_read_string(des, des_read_uint16(des));
VALUE response_uri = des_read_string(des, des_read_uint16(des));
des_read_uint32(des); // Length is ignored
VALUE data = des_deserialize(des_rb, INT2FIX(0), Qnil);
// If they're using the flex remoting APIs, remove array wrapper
if(TYPE(data) == T_ARRAY && RARRAY_LEN(data) == 1 && rb_obj_is_kind_of(RARRAY_PTR(data)[0], cRocketAMFAbstractMessage) == Qtrue) {
data = RARRAY_PTR(data)[0];
}
args[0] = target_uri;
args[1] = response_uri;
args[2] = data;
rb_ary_push(messages, rb_class_new_instance(3, args, cRocketAMFMessage));
}
// Populate remoting object
rb_ivar_set(self, id_amf_version, INT2FIX(amf_ver));
rb_ivar_set(self, id_headers, headers);
rb_ivar_set(self, id_messages, messages);
return self;
}
/*
* call-seq:
* env.serialize(class_mapper=nil)
*
* Included into RocketAMF::Envelope, this method handles serializing an AMF
* request/response into a string
*/
static VALUE env_serialize(int argc, VALUE *argv, VALUE self) {
static VALUE cClassMapper = 0;
if(cClassMapper == 0) cClassMapper = rb_const_get(mRocketAMF, rb_intern("ClassMapper"));
// Parse args
VALUE class_mapper;
rb_scan_args(argc, argv, "01", &class_mapper);
if(class_mapper == Qnil) class_mapper = rb_class_new_instance(0, NULL, cClassMapper);
// Get instance variables
long amf_ver = FIX2LONG(rb_ivar_get(self, id_amf_version));
VALUE headers = rb_funcall(rb_ivar_get(self, id_headers), rb_intern("values"), 0); // Get array of header values
VALUE messages = rb_ivar_get(self, id_messages);
// Create AMF0 serializer
VALUE args[1] = {class_mapper};
VALUE ser_rb = rb_class_new_instance(1, args, cSerializer);
AMF_SERIALIZER *ser;
Data_Get_Struct(ser_rb, AMF_SERIALIZER, ser);
// Write version
ser_write_uint16(ser, amf_ver);
// Write headers
long header_cnt = RARRAY_LEN(headers);
ser_write_uint16(ser, header_cnt);
int i;
char *str;
long str_len;
for(i = 0; i < header_cnt; i++) {
VALUE header = RARRAY_PTR(headers)[i];
// Write header name
ser_get_string(rb_funcall(header, rb_intern("name"), 0), Qtrue, &str, &str_len);
ser_write_uint16(ser, str_len);
rb_str_buf_cat(ser->stream, str, str_len);
// Write understand flag
ser_write_byte(ser, rb_funcall(header, rb_intern("must_understand"), 0) == Qtrue ? 1 : 0);
// Serialize data
ser_write_uint32(ser, -1); // Data length; -1 means unknown
ser_serialize(ser_rb, INT2FIX(0), rb_funcall(header, id_data, 0));
}
// Write messages
long message_cnt = RARRAY_LEN(messages);
ser_write_uint16(ser, message_cnt);
for(i = 0; i < message_cnt; i++) {
VALUE message = RARRAY_PTR(messages)[i];
// Write target_uri
ser_get_string(rb_funcall(message, rb_intern("target_uri"), 0), Qtrue, &str, &str_len);
ser_write_uint16(ser, str_len);
rb_str_buf_cat(ser->stream, str, str_len);
// Write response_uri
ser_get_string(rb_funcall(message, rb_intern("response_uri"), 0), Qtrue, &str, &str_len);
ser_write_uint16(ser, str_len);
rb_str_buf_cat(ser->stream, str, str_len);
// Serialize data
ser_write_uint32(ser, -1); // Data length; -1 means unknown
if(amf_ver == 3) {
ser_write_byte(ser, AMF0_AMF3_MARKER);
ser_serialize(ser_rb, INT2FIX(3), rb_funcall(message, id_data, 0));
} else {
ser_serialize(ser_rb, INT2FIX(0), rb_funcall(message, id_data, 0));
}
}
return ser->stream;
}
void Init_rocket_amf_remoting() {
VALUE mEnvelope = rb_define_module_under(mRocketAMFExt, "Envelope");
rb_define_method(mEnvelope, "populate_from_stream", env_populate_from_stream, -1);
rb_define_method(mEnvelope, "serialize", env_serialize, -1);
// Get refs to commonly used symbols and ids
id_amf_version = rb_intern("@amf_version");
id_headers = rb_intern("@headers");
id_messages = rb_intern("@messages");
id_data = rb_intern("data");
cRocketAMFHeader = rb_const_get(mRocketAMF, rb_intern("Header"));
cRocketAMFMessage = rb_const_get(mRocketAMF, rb_intern("Message"));
cRocketAMFAbstractMessage = rb_const_get(rb_const_get(mRocketAMF, rb_intern("Values")), rb_intern("AbstractMessage"));
}


@@ -0,0 +1,38 @@
#include <ruby.h>
VALUE mRocketAMF;
VALUE mRocketAMFExt;
VALUE cDeserializer;
VALUE cSerializer;
VALUE cStringIO;
VALUE cDate;
VALUE cDateTime;
VALUE sym_class_name;
VALUE sym_members;
VALUE sym_externalizable;
VALUE sym_dynamic;
void Init_rocket_amf_deserializer();
void Init_rocket_amf_serializer();
void Init_rocket_amf_fast_class_mapping();
void Init_rocket_amf_remoting();
void Init_rocketamf_ext() {
mRocketAMF = rb_define_module("RocketAMF");
mRocketAMFExt = rb_define_module_under(mRocketAMF, "Ext");
// Set up classes
Init_rocket_amf_deserializer();
Init_rocket_amf_serializer();
Init_rocket_amf_fast_class_mapping();
Init_rocket_amf_remoting();
// Get refs to commonly used symbols and ids
cStringIO = rb_const_get(rb_cObject, rb_intern("StringIO"));
cDate = rb_const_get(rb_cObject, rb_intern("Date"));
cDateTime = rb_const_get(rb_cObject, rb_intern("DateTime"));
sym_class_name = ID2SYM(rb_intern("class_name"));
sym_members = ID2SYM(rb_intern("members"));
sym_externalizable = ID2SYM(rb_intern("externalizable"));
sym_dynamic = ID2SYM(rb_intern("dynamic"));
}


@@ -0,0 +1,834 @@
#include "serializer.h"
#include "constants.h"
#include "utility.h"
extern VALUE mRocketAMF;
extern VALUE mRocketAMFExt;
extern VALUE cSerializer;
extern VALUE cStringIO;
extern VALUE cDate;
extern VALUE cDateTime;
extern VALUE sym_class_name;
extern VALUE sym_members;
extern VALUE sym_externalizable;
extern VALUE sym_dynamic;
VALUE cArrayCollection;
ID id_haskey;
ID id_encode_amf;
ID id_is_array_collection;
ID id_use_array_collection;
ID id_get_as_class_name;
ID id_props_for_serialization;
ID id_utc;
ID id_to_f;
ID id_is_integer;
static VALUE ser0_serialize(VALUE self, VALUE obj);
static VALUE ser3_serialize(VALUE self, VALUE obj);
void ser_write_byte(AMF_SERIALIZER *ser, char byte) {
char bytes[2] = {byte, '\0'};
rb_str_buf_cat(ser->stream, bytes, 1);
}
void ser_write_int(AMF_SERIALIZER *ser, int num) {
char tmp[4];
int tmp_len;
num &= 0x1fffffff;
if (num < 0x80) {
tmp_len = 1;
tmp[0] = num;
} else if (num < 0x4000) {
tmp_len = 2;
tmp[0] = (num >> 7 & 0x7f) | 0x80;
tmp[1] = num & 0x7f;
} else if (num < 0x200000) {
tmp_len = 3;
tmp[0] = (num >> 14 & 0x7f) | 0x80;
tmp[1] = (num >> 7 & 0x7f) | 0x80;
tmp[2] = num & 0x7f;
} else if (num < 0x40000000) {
tmp_len = 4;
tmp[0] = (num >> 22 & 0x7f) | 0x80;
tmp[1] = (num >> 15 & 0x7f) | 0x80;
tmp[2] = (num >> 8 & 0x7f) | 0x80;
tmp[3] = (num & 0xff);
} else {
rb_raise(rb_eRangeError, "int %d out of range", num);
}
rb_str_buf_cat(ser->stream, tmp, tmp_len);
}
void ser_write_uint16(AMF_SERIALIZER *ser, long num) {
if(num > 0xffff) rb_raise(rb_eRangeError, "int %ld out of range", num);
char tmp[2] = {(num >> 8) & 0xff, num & 0xff};
rb_str_buf_cat(ser->stream, tmp, 2);
}
void ser_write_uint32(AMF_SERIALIZER *ser, long num) {
if(num > 0xffffffff) rb_raise(rb_eRangeError, "int %ld out of range", num);
char tmp[4] = {(num >> 24) & 0xff, (num >> 16) & 0xff, (num >> 8) & 0xff, num & 0xff};
rb_str_buf_cat(ser->stream, tmp, 4);
}
void ser_write_double(AMF_SERIALIZER *ser, double num) {
union aligned {
double dval;
char cval[8];
} d;
const char *number = d.cval;
d.dval = num;
#ifdef WORDS_BIGENDIAN
rb_str_buf_cat(ser->stream, number, 8);
#else
char netnum[8] = {number[7],number[6],number[5],number[4],number[3],number[2],number[1],number[0]};
rb_str_buf_cat(ser->stream, netnum, 8);
#endif
}
void ser_get_string(VALUE obj, VALUE encode, char** str, long* len) {
int type = TYPE(obj);
if(type == T_STRING) {
#ifdef HAVE_RB_STR_ENCODE
if(encode == Qtrue) {
rb_encoding *enc = rb_enc_get(obj);
if (enc != rb_ascii8bit_encoding()) {
rb_encoding *utf8 = rb_utf8_encoding();
if (enc != utf8) obj = rb_str_encode(obj, rb_enc_from_encoding(utf8), 0, Qnil);
}
}
#endif
*str = RSTRING_PTR(obj);
*len = RSTRING_LEN(obj);
} else if(type == T_SYMBOL) {
*str = (char*)rb_id2name(SYM2ID(obj));
*len = strlen(*str);
} else if(obj == Qnil) {
*len = 0;
} else {
rb_raise(rb_eArgError, "Invalid type in ser_get_string: %d", type);
}
}
/*
* Write the given array in AMF0 notation
*/
static void ser0_write_array(VALUE self, VALUE ary) {
AMF_SERIALIZER *ser;
Data_Get_Struct(self, AMF_SERIALIZER, ser);
// Cache it
st_add_direct(ser->obj_cache, ary, LONG2FIX(ser->obj_index));
ser->obj_index++;
// Write it out
long i, len = RARRAY_LEN(ary);
ser_write_byte(ser, AMF0_STRICT_ARRAY_MARKER);
ser_write_uint32(ser, len);
for(i = 0; i < len; i++) {
ser0_serialize(self, RARRAY_PTR(ary)[i]);
}
}
/*
* Supports writing strings and symbols. For hash keys, strings all have 16 bit
* lengths, so writing a type marker is unnecessary. In that case the third
* parameter should be set to Qfalse instead of Qtrue.
*/
static void ser0_write_string(AMF_SERIALIZER *ser, VALUE obj, VALUE write_marker) {
// Extract char array and length from object
char* str;
long len;
ser_get_string(obj, Qtrue, &str, &len);
// Write string
if(len > 0xffff) {
if(write_marker == Qtrue) ser_write_byte(ser, AMF0_LONG_STRING_MARKER);
ser_write_uint32(ser, len);
} else {
if(write_marker == Qtrue) ser_write_byte(ser, AMF0_STRING_MARKER);
ser_write_uint16(ser, len);
}
rb_str_buf_cat(ser->stream, str, len);
}
/*
* Hash iterator for object properties that writes the key and then serializes
* the value
*/
static int ser0_hash_iter(VALUE key, VALUE val, const VALUE args[1]) {
AMF_SERIALIZER *ser;
Data_Get_Struct(args[0], AMF_SERIALIZER, ser);
// Write key and value
ser0_write_string(ser, key, Qfalse); // Technically incorrect if key length is longer than a 16 bit string, but if you run into that you're screwed anyways
ser0_serialize(args[0], val);
return ST_CONTINUE;
}
/*
* Used for both hashes and objects. Takes the object and the props hash or Qnil,
* which forces a call to the class mapper's props_for_serialization. Prop
* sorting must be enabled with extconf.rb's --enable-sort-props flag, so the
* tests will typically not pass on Ruby 1.8 without it.
*/
static void ser0_write_object(VALUE self, VALUE obj, VALUE props) {
AMF_SERIALIZER *ser;
Data_Get_Struct(self, AMF_SERIALIZER, ser);
// Cache it
st_add_direct(ser->obj_cache, obj, LONG2FIX(ser->obj_index));
ser->obj_index++;
// Make a request for props hash unless we already have it
if(props == Qnil) {
props = rb_funcall(ser->class_mapper, id_props_for_serialization, 1, obj);
}
// Write header
VALUE class_name = rb_funcall(ser->class_mapper, id_get_as_class_name, 1, obj);
if(class_name != Qnil) {
ser_write_byte(ser, AMF0_TYPED_OBJECT_MARKER);
ser0_write_string(ser, class_name, Qfalse);
} else {
ser_write_byte(ser, AMF0_OBJECT_MARKER);
}
// Write out data
VALUE args[1] = {self};
#ifdef SORT_PROPS
// Sort is required prior to Ruby 1.9 to pass all the tests, as Ruby 1.8 hashes don't store insert order
VALUE sorted_props = rb_funcall(props, rb_intern("sort"), 0);
long i, len = RARRAY_LEN(sorted_props);
for(i = 0; i < len; i++) {
VALUE pair = RARRAY_PTR(sorted_props)[i];
ser0_hash_iter(RARRAY_PTR(pair)[0], RARRAY_PTR(pair)[1], args);
}
#else
rb_hash_foreach(props, ser0_hash_iter, (st_data_t)args);
#endif
ser_write_uint16(ser, 0);
ser_write_byte(ser, AMF0_OBJECT_END_MARKER);
}
static void ser0_write_time(VALUE self, VALUE time) {
AMF_SERIALIZER *ser;
Data_Get_Struct(self, AMF_SERIALIZER, ser);
ser_write_byte(ser, AMF0_DATE_MARKER);
// Write time
time = rb_obj_dup(time);
rb_funcall(time, id_utc, 0);
double tmp_num = NUM2DBL(rb_funcall(time, id_to_f, 0)) * 1000;
ser_write_double(ser, tmp_num);
ser_write_uint16(ser, 0); // Time zone
}
static void ser0_write_date(VALUE self, VALUE date) {
AMF_SERIALIZER *ser;
Data_Get_Struct(self, AMF_SERIALIZER, ser);
ser_write_byte(ser, AMF0_DATE_MARKER);
// Write time
double tmp_num = rb_str_to_dbl(rb_funcall(date, rb_intern("strftime"), 1, rb_str_new2("%Q")), Qfalse);
ser_write_double(ser, tmp_num);
ser_write_uint16(ser, 0); // Time zone
}
/*
* Serializes the object to a string and returns that string
*/
static VALUE ser0_serialize(VALUE self, VALUE obj) {
AMF_SERIALIZER *ser;
Data_Get_Struct(self, AMF_SERIALIZER, ser);
int type = TYPE(obj);
VALUE klass = Qnil;
if(type == T_OBJECT || type == T_DATA) {
klass = CLASS_OF(obj);
}
VALUE obj_index;
if(st_lookup(ser->obj_cache, obj, &obj_index)) {
ser_write_byte(ser, AMF0_REFERENCE_MARKER);
ser_write_uint16(ser, FIX2LONG(obj_index));
} else if(rb_respond_to(obj, id_encode_amf)) {
rb_funcall(obj, id_encode_amf, 1, self);
} else if(type == T_STRING || type == T_SYMBOL) {
ser0_write_string(ser, obj, Qtrue);
} else if(rb_obj_is_kind_of(obj, rb_cNumeric)) {
ser_write_byte(ser, AMF0_NUMBER_MARKER);
ser_write_double(ser, RFLOAT_VALUE(rb_Float(obj)));
} else if(type == T_NIL) {
ser_write_byte(ser, AMF0_NULL_MARKER);
} else if(type == T_TRUE || type == T_FALSE) {
ser_write_byte(ser, AMF0_BOOLEAN_MARKER);
ser_write_byte(ser, type == T_TRUE ? 1 : 0);
} else if(type == T_ARRAY) {
ser0_write_array(self, obj);
} else if(klass == rb_cTime) {
ser0_write_time(self, obj);
} else if(klass == cDate || klass == cDateTime) {
ser0_write_date(self, obj);
} else if(type == T_HASH || type == T_OBJECT) {
ser0_write_object(self, obj, Qnil);
}
return ser->stream;
}
/*
* Writes an AMF3 style string. Accepts strings, symbols, and nil, and handles
* all the necessary encoding and caching.
*/
static void ser3_write_utf8vr(AMF_SERIALIZER *ser, VALUE obj) {
// Extract char array and length from object
char* str;
long len;
ser_get_string(obj, Qtrue, &str, &len);
// Write string
VALUE str_index;
if(len == 0) {
ser_write_byte(ser, AMF3_EMPTY_STRING);
} else if(st_lookup(ser->str_cache, (st_data_t)str, &str_index)) {
ser_write_int(ser, FIX2INT(str_index) << 1);
} else {
st_add_direct(ser->str_cache, (st_data_t)strdup(str), LONG2FIX(ser->str_index));
ser->str_index++;
ser_write_int(ser, ((int)len) << 1 | 1);
rb_str_buf_cat(ser->stream, str, len);
}
}
/*
* Writes Numeric conforming object using AMF3 notation
*/
static void ser3_write_numeric(AMF_SERIALIZER *ser, VALUE num) {
// Is it an integer in range?
if(rb_funcall(num, id_is_integer, 0) == Qtrue) {
// It's an integer internally, so now we need to check if it's in range
VALUE int_obj = rb_Integer(num);
if(TYPE(int_obj) == T_FIXNUM) {
long long_val = FIX2LONG(int_obj);
if(long_val < MIN_INTEGER || long_val > MAX_INTEGER) {
// Outside range, but we have a value already, so just cast to double
ser_write_byte(ser, AMF3_DOUBLE_MARKER);
ser_write_double(ser, (double)long_val);
} else {
// Inside valid integer range
ser_write_byte(ser, AMF3_INTEGER_MARKER);
ser_write_int(ser, (int)long_val);
}
return;
}
}
// It's either not an integer or out of range, so write as a double
ser_write_byte(ser, AMF3_DOUBLE_MARKER);
ser_write_double(ser, RFLOAT_VALUE(rb_Float(num)));
}
/*
* Writes the given array using AMF3 notation
*/
static void ser3_write_array(VALUE self, VALUE ary) {
AMF_SERIALIZER *ser;
Data_Get_Struct(self, AMF_SERIALIZER, ser);
// Is it an array collection?
VALUE is_ac = Qfalse;
if(rb_respond_to(ary, id_is_array_collection)) {
is_ac = rb_funcall(ary, id_is_array_collection, 0);
} else {
is_ac = rb_funcall(ser->class_mapper, id_use_array_collection, 0);
}
// Write type marker
ser_write_byte(ser, is_ac ? AMF3_OBJECT_MARKER : AMF3_ARRAY_MARKER);
// Write object ref, or cache it
VALUE obj_index;
if(st_lookup(ser->obj_cache, ary, &obj_index)) {
ser_write_int(ser, FIX2INT(obj_index) << 1);
return;
} else {
st_add_direct(ser->obj_cache, ary, LONG2FIX(ser->obj_index));
ser->obj_index++;
if(is_ac) ser->obj_index++; // The wrapped source array takes a cache slot too
}
// Write out traits and array marker if it's an array collection
if(is_ac) {
VALUE trait_index;
char array_collection_name[34] = "flex.messaging.io.ArrayCollection";
if(st_lookup(ser->trait_cache, (st_data_t)array_collection_name, &trait_index)) {
ser_write_int(ser, FIX2INT(trait_index) << 2 | 0x01);
} else {
st_add_direct(ser->trait_cache, (st_data_t)strdup(array_collection_name), LONG2FIX(ser->trait_index));
ser->trait_index++;
ser_write_byte(ser, 0x07); // Trait header
ser3_write_utf8vr(ser, rb_str_new2(array_collection_name));
}
ser_write_byte(ser, AMF3_ARRAY_MARKER);
}
// Write header
int header = ((int)RARRAY_LEN(ary)) << 1 | 1;
ser_write_int(ser, header);
ser_write_byte(ser, AMF3_CLOSE_DYNAMIC_ARRAY);
// Write contents
long i, len = RARRAY_LEN(ary);
for(i = 0; i < len; i++) {
ser3_serialize(self, RARRAY_PTR(ary)[i]);
}
}
/*
* AMF3 property hash write iterator. Checks the args->extra hash, if given,
* and skips properties that are keys in that hash.
*/
static int ser3_hash_iter(VALUE key, VALUE val, const VALUE args[2]) {
AMF_SERIALIZER *ser;
Data_Get_Struct(args[0], AMF_SERIALIZER, ser);
if(args[1] == Qnil || rb_funcall(args[1], id_haskey, 1, key) == Qfalse) {
// Write key and value
ser3_write_utf8vr(ser, key);
ser3_serialize(args[0], val);
}
return ST_CONTINUE;
}
/*
* Used for both hashes and objects. Takes the object and the props hash or Qnil,
* which forces a call to the class mapper's props_for_serialization. Prop
* sorting must be enabled with extconf.rb's --enable-sort-props flag, so the
* tests will typically not pass on Ruby 1.8 without it. If you need specific traits, you can
* also pass that in, or pass Qnil to use the default traits - dynamic with no
* defined members.
*/
static void ser3_write_object(VALUE self, VALUE obj, VALUE props, VALUE traits) {
AMF_SERIALIZER *ser;
Data_Get_Struct(self, AMF_SERIALIZER, ser);
long i;
// Write type marker
ser_write_byte(ser, AMF3_OBJECT_MARKER);
// Write object ref, or cache it
VALUE obj_index;
if(st_lookup(ser->obj_cache, obj, &obj_index)) {
ser_write_int(ser, FIX2INT(obj_index) << 1);
return;
} else {
st_add_direct(ser->obj_cache, obj, LONG2FIX(ser->obj_index));
ser->obj_index++;
}
// Extract traits data, or use defaults
VALUE is_default = Qfalse;
VALUE class_name = Qnil;
VALUE members = Qnil;
long members_len = 0;
VALUE dynamic = Qtrue;
VALUE externalizable = Qfalse;
if(traits == Qnil) {
class_name = rb_funcall(ser->class_mapper, id_get_as_class_name, 1, obj);
if(class_name == Qnil) is_default = Qtrue;
} else {
class_name = rb_hash_aref(traits, sym_class_name);
members = rb_hash_aref(traits, sym_members);
if(members != Qnil) members_len = RARRAY_LEN(members);
dynamic = rb_hash_aref(traits, sym_dynamic);
externalizable = rb_hash_aref(traits, sym_externalizable);
}
// Handle trait caching
int did_ref = 0;
VALUE trait_index;
if(is_default == Qtrue || class_name != Qnil) {
const char *ref_class_name = is_default == Qtrue ? "__default__" : RSTRING_PTR(class_name);
if(st_lookup(ser->trait_cache, (st_data_t)ref_class_name, &trait_index)) {
ser_write_int(ser, FIX2INT(trait_index) << 2 | 0x01);
did_ref = 1;
} else {
st_add_direct(ser->trait_cache, (st_data_t)strdup(ref_class_name), LONG2FIX(ser->trait_index));
ser->trait_index++;
}
}
// Write traits out if we didn't write a reference
if(!did_ref) {
// Write out trait header
int header = 0x03;
if(dynamic == Qtrue) header |= 0x02 << 2;
if(externalizable == Qtrue) header |= 0x01 << 2;
header |= ((int)members_len) << 4;
ser_write_int(ser, header);
// Write class name
ser3_write_utf8vr(ser, class_name);
// Write out members
for(i = 0; i < members_len; i++) {
ser3_write_utf8vr(ser, RARRAY_PTR(members)[i]);
}
}
// If externalizable, the object writes itself out and we're done
if(externalizable == Qtrue) {
rb_funcall(obj, rb_intern("write_external"), 1, self);
return;
}
// Make a request for props hash unless we already have it
if(props == Qnil) {
props = rb_funcall(ser->class_mapper, id_props_for_serialization, 1, obj);
}
// Write sealed members
VALUE skipped_members = members_len ? rb_hash_new() : Qnil;
for(i = 0; i < members_len; i++) {
ser3_serialize(self, rb_hash_aref(props, RARRAY_PTR(members)[i]));
rb_hash_aset(skipped_members, RARRAY_PTR(members)[i], Qtrue);
}
// Write dynamic properties
if(dynamic == Qtrue) {
VALUE args[2] = {self, skipped_members};
#ifdef SORT_PROPS
// Sort is required prior to Ruby 1.9 to pass all the tests, as Ruby 1.8 hashes don't store insert order
VALUE sorted_props = rb_funcall(props, rb_intern("sort"), 0);
for(i = 0; i < RARRAY_LEN(sorted_props); i++) {
VALUE pair = RARRAY_PTR(sorted_props)[i];
ser3_hash_iter(RARRAY_PTR(pair)[0], RARRAY_PTR(pair)[1], args);
}
#else
rb_hash_foreach(props, ser3_hash_iter, (st_data_t)args);
#endif
ser_write_byte(ser, AMF3_CLOSE_DYNAMIC_OBJECT);
}
}
static void ser3_write_time(VALUE self, VALUE time_obj) {
AMF_SERIALIZER *ser;
Data_Get_Struct(self, AMF_SERIALIZER, ser);
ser_write_byte(ser, AMF3_DATE_MARKER);
// Write object ref, or cache it
VALUE obj_index;
if(st_lookup(ser->obj_cache, time_obj, &obj_index)) {
ser_write_int(ser, FIX2INT(obj_index) << 1);
return;
} else {
st_add_direct(ser->obj_cache, time_obj, LONG2FIX(ser->obj_index));
ser->obj_index++;
}
// Write time
ser_write_byte(ser, AMF3_NULL_MARKER); // Ref header
time_obj = rb_obj_dup(time_obj);
rb_funcall(time_obj, id_utc, 0);
double tmp_num = NUM2DBL(rb_funcall(time_obj, id_to_f, 0)) * 1000;
ser_write_double(ser, tmp_num);
}
static void ser3_write_date(VALUE self, VALUE date) {
AMF_SERIALIZER *ser;
Data_Get_Struct(self, AMF_SERIALIZER, ser);
ser_write_byte(ser, AMF3_DATE_MARKER);
// Write object ref, or cache it
VALUE obj_index;
if(st_lookup(ser->obj_cache, date, &obj_index)) {
ser_write_int(ser, FIX2INT(obj_index) << 1);
return;
} else {
st_add_direct(ser->obj_cache, date, LONG2FIX(ser->obj_index));
ser->obj_index++;
}
// Write time
ser_write_byte(ser, AMF3_NULL_MARKER); // Ref header
double tmp_num = rb_str_to_dbl(rb_funcall(date, rb_intern("strftime"), 1, rb_str_new2("%Q")), Qfalse);
ser_write_double(ser, tmp_num);
}
static void ser3_write_byte_array(VALUE self, VALUE ba) {
AMF_SERIALIZER *ser;
Data_Get_Struct(self, AMF_SERIALIZER, ser);
ser_write_byte(ser, AMF3_BYTE_ARRAY_MARKER);
// Write object ref, or cache it
VALUE obj_index;
if(st_lookup(ser->obj_cache, ba, &obj_index)) {
ser_write_int(ser, FIX2INT(obj_index) << 1);
return;
} else {
st_add_direct(ser->obj_cache, ba, LONG2FIX(ser->obj_index));
ser->obj_index++;
}
// Write byte array
VALUE str = rb_funcall(ba, rb_intern("string"), 0);
int len = (int)(RSTRING_LEN(str) << 1); // Explicitly cast to int to avoid compiler warning
ser_write_int(ser, len | 1);
rb_str_buf_cat(ser->stream, RSTRING_PTR(str), RSTRING_LEN(str));
}
/*
* Serializes the object to a string and returns that string
*/
static VALUE ser3_serialize(VALUE self, VALUE obj) {
AMF_SERIALIZER *ser;
Data_Get_Struct(self, AMF_SERIALIZER, ser);
int type = TYPE(obj);
VALUE klass = Qnil;
if(type == T_OBJECT || type == T_DATA || type == T_ARRAY) {
klass = CLASS_OF(obj);
}
if(rb_respond_to(obj, id_encode_amf)) {
rb_funcall(obj, id_encode_amf, 1, self);
} else if(type == T_STRING || type == T_SYMBOL) {
ser_write_byte(ser, AMF3_STRING_MARKER);
ser3_write_utf8vr(ser, obj);
} else if(rb_obj_is_kind_of(obj, rb_cNumeric)) {
ser3_write_numeric(ser, obj);
} else if(type == T_NIL) {
ser_write_byte(ser, AMF3_NULL_MARKER);
} else if(type == T_TRUE) {
ser_write_byte(ser, AMF3_TRUE_MARKER);
} else if(type == T_FALSE) {
ser_write_byte(ser, AMF3_FALSE_MARKER);
} else if(type == T_ARRAY) {
ser3_write_array(self, obj);
} else if(type == T_HASH) {
ser3_write_object(self, obj, Qnil, Qnil);
} else if(klass == rb_cTime) {
ser3_write_time(self, obj);
} else if(klass == cDate || klass == cDateTime) {
ser3_write_date(self, obj);
} else if(klass == cStringIO) {
ser3_write_byte_array(self, obj);
} else if(type == T_OBJECT) {
ser3_write_object(self, obj, Qnil, Qnil);
}
return ser->stream;
}
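The dispatch in `ser3_serialize` is essentially a marker-selection table. A rough pure-Ruby sketch of that selection (simplified: it ignores the `encode_amf` hook and the integer-overflow fallback to double that `ser3_write_numeric` performs; `amf3_marker_for` is a hypothetical helper, not library API), using the marker values from constants.rb:

```ruby
# AMF3 marker bytes as defined in RocketAMF's constants.rb
AMF3_NULL_MARKER    = 0x01
AMF3_FALSE_MARKER   = 0x02
AMF3_TRUE_MARKER    = 0x03
AMF3_INTEGER_MARKER = 0x04
AMF3_DOUBLE_MARKER  = 0x05
AMF3_STRING_MARKER  = 0x06
AMF3_DATE_MARKER    = 0x08
AMF3_ARRAY_MARKER   = 0x09
AMF3_OBJECT_MARKER  = 0x0A

# Mirrors the C dispatch order: strings/symbols, numerics, nil,
# booleans, arrays, Time, then everything else as a generic object
# (hashes also serialize as AMF3 objects).
def amf3_marker_for(obj)
  case obj
  when String, Symbol then AMF3_STRING_MARKER
  when Integer        then AMF3_INTEGER_MARKER
  when Numeric        then AMF3_DOUBLE_MARKER
  when nil            then AMF3_NULL_MARKER
  when true           then AMF3_TRUE_MARKER
  when false          then AMF3_FALSE_MARKER
  when Array          then AMF3_ARRAY_MARKER
  when Time           then AMF3_DATE_MARKER
  else                     AMF3_OBJECT_MARKER
  end
end
```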
/*
* Mark ruby objects for GC
*/
static void ser_mark(AMF_SERIALIZER *ser) {
if(!ser) return;
rb_gc_mark(ser->class_mapper);
rb_gc_mark(ser->stream);
}
/*
* Free cache tables, stream and the struct itself
*/
int ser_free_strtable_key(st_data_t key, st_data_t value, st_data_t ignored)
{
xfree((void *)key);
return ST_DELETE;
}
static inline void ser_free_cache(AMF_SERIALIZER *ser) {
if(ser->str_cache) {
st_foreach(ser->str_cache, ser_free_strtable_key, 0);
st_free_table(ser->str_cache);
ser->str_cache = NULL;
}
if(ser->trait_cache) {
st_foreach(ser->trait_cache, ser_free_strtable_key, 0);
st_free_table(ser->trait_cache);
ser->trait_cache = NULL;
}
if(ser->obj_cache) {
st_free_table(ser->obj_cache);
ser->obj_cache = NULL;
}
}
static void ser_free(AMF_SERIALIZER *ser) {
ser_free_cache(ser);
xfree(ser);
}
/*
* Create new struct and wrap with class
*/
static VALUE ser_alloc(VALUE klass) {
// Allocate struct
AMF_SERIALIZER *ser = ALLOC(AMF_SERIALIZER);
memset(ser, 0, sizeof(AMF_SERIALIZER));
return Data_Wrap_Struct(klass, ser_mark, ser_free, ser);
}
/*
* Initializer
*/
static VALUE ser_initialize(VALUE self, VALUE class_mapper) {
AMF_SERIALIZER *ser;
Data_Get_Struct(self, AMF_SERIALIZER, ser);
ser->class_mapper = class_mapper;
ser->depth = 0;
ser->stream = rb_str_buf_new(0);
return self;
}
/*
* call-seq:
* ser.version => int
*
* Returns the serializer version number, so that a custom encode_amf method
* knows which version to encode for
*/
static VALUE ser_version(VALUE self) {
AMF_SERIALIZER *ser;
Data_Get_Struct(self, AMF_SERIALIZER, ser);
return INT2FIX(ser->version);
}
/*
* call-seq:
* ser.stream => string
*
* Returns the string that the serializer is writing to
*/
static VALUE ser_stream(VALUE self) {
AMF_SERIALIZER *ser;
Data_Get_Struct(self, AMF_SERIALIZER, ser);
return ser->stream;
}
/*
* call-seq:
* ser.serialize(amf_ver, obj) => string
*
* Serialize the given object to the current stream and returns the stream
*/
VALUE ser_serialize(VALUE self, VALUE ver, VALUE obj) {
AMF_SERIALIZER *ser;
Data_Get_Struct(self, AMF_SERIALIZER, ser);
// Process version
int int_ver = FIX2INT(ver);
if(int_ver != 0 && int_ver != 3) rb_raise(rb_eArgError, "unsupported version %d", int_ver);
ser->version = int_ver;
// Initialize caches
if(ser->depth == 0) {
ser->obj_cache = st_init_numtable();
ser->obj_index = 0;
if(ser->version == 3) {
ser->str_cache = st_init_strtable();
ser->str_index = 0;
ser->trait_cache = st_init_strtable();
ser->trait_index = 0;
}
}
ser->depth++;
// Perform serialization
if(ser->version == 0) {
ser0_serialize(self, obj);
} else {
ser3_serialize(self, obj);
}
// Clean up
ser->depth--;
if(ser->depth == 0) ser_free_cache(ser);
return ser->stream;
}
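The depth counter in `ser_serialize` exists so that nested `serialize` calls (e.g. from `encode_amf` hooks) share the reference caches of the outermost call, which initializes them and frees them on return. A minimal Ruby sketch of that lifecycle (hypothetical class, just to illustrate the pattern):

```ruby
# Depth-guarded cache lifecycle: caches live for the duration of the
# outermost serialize call and are shared by any nested calls.
class CacheLifecycle
  attr_reader :obj_cache

  def initialize
    @depth = 0
    @obj_cache = nil
  end

  def serialize
    @obj_cache = {} if @depth.zero?    # outermost call initializes caches
    @depth += 1
    begin
      yield @obj_cache
    ensure
      @depth -= 1
      @obj_cache = nil if @depth.zero? # ...and tears them down on return
    end
  end
end
```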
/*
* call-seq:
* ser.write_array(ary) => ser
*
* Serializes the given array to the serializer stream
*/
static VALUE ser_write_array(VALUE self, VALUE ary) {
AMF_SERIALIZER *ser;
Data_Get_Struct(self, AMF_SERIALIZER, ser);
if(ser->version == 0) {
ser0_write_array(self, ary);
} else {
ser3_write_array(self, ary);
}
return self;
}
/*
* call-seq:
* ser.write_object(obj, props=nil) => ser
* ser.write_object(obj, props=nil, traits=nil) => ser
*
* Serializes the given object or hash to the serializer stream using
* the proper serializer version. If given a props hash, uses that
* instead of using the class mapper to calculate it. If given a traits
* hash for AMF3, uses that instead of the default dynamic traits with
* the mapped class name.
*/
static VALUE ser_write_object(int argc, VALUE *argv, VALUE self) {
AMF_SERIALIZER *ser;
Data_Get_Struct(self, AMF_SERIALIZER, ser);
// Check args and call implementation
VALUE obj;
VALUE props = Qnil;
VALUE traits = Qnil;
if(ser->version == 0) {
rb_scan_args(argc, argv, "11", &obj, &props);
ser0_write_object(self, obj, props);
} else {
rb_scan_args(argc, argv, "12", &obj, &props, &traits);
ser3_write_object(self, obj, props, traits);
}
return self;
}
void Init_rocket_amf_serializer() {
// Define Serializer
cSerializer = rb_define_class_under(mRocketAMFExt, "Serializer", rb_cObject);
rb_define_alloc_func(cSerializer, ser_alloc);
rb_define_method(cSerializer, "initialize", ser_initialize, 1);
rb_define_method(cSerializer, "version", ser_version, 0);
rb_define_method(cSerializer, "stream", ser_stream, 0);
rb_define_method(cSerializer, "serialize", ser_serialize, 2);
rb_define_method(cSerializer, "write_array", ser_write_array, 1);
rb_define_method(cSerializer, "write_object", ser_write_object, -1);
// Get refs to commonly used symbols and ids
id_haskey = rb_intern("has_key?");
id_encode_amf = rb_intern("encode_amf");
id_is_array_collection = rb_intern("is_array_collection?");
id_use_array_collection = rb_intern("use_array_collection");
id_get_as_class_name = rb_intern("get_as_class_name");
id_props_for_serialization = rb_intern("props_for_serialization");
id_utc = rb_intern("utc");
id_to_f = rb_intern("to_f");
id_is_integer = rb_intern("integer?");
}


@ -0,0 +1,29 @@
#include <ruby.h>
#ifdef HAVE_RB_STR_ENCODE
#include <ruby/st.h>
#include <ruby/encoding.h>
#else
#include <st.h>
#endif
typedef struct {
int version;
VALUE class_mapper;
VALUE stream;
long depth;
st_table* str_cache;
long str_index;
st_table* trait_cache;
long trait_index;
st_table* obj_cache;
long obj_index;
} AMF_SERIALIZER;
void ser_write_byte(AMF_SERIALIZER *ser, char byte);
void ser_write_int(AMF_SERIALIZER *ser, int num);
void ser_write_uint16(AMF_SERIALIZER *ser, long num);
void ser_write_uint32(AMF_SERIALIZER *ser, long num);
void ser_write_double(AMF_SERIALIZER *ser, double num);
void ser_get_string(VALUE obj, VALUE encode, char** str, long* len);
VALUE ser_serialize(VALUE self, VALUE ver, VALUE obj);


@ -0,0 +1,4 @@
// Before RFLOAT_VALUE, value was in a different place in the struct
#ifndef RFLOAT_VALUE
#define RFLOAT_VALUE(v) (RFLOAT(v)->value)
#endif


@ -0,0 +1,216 @@
$:.unshift(File.dirname(__FILE__)) unless $:.include?(File.dirname(__FILE__)) || $:.include?(File.expand_path(File.dirname(__FILE__)))
$:.unshift "#{File.expand_path(File.dirname(__FILE__))}/rocketamf/"
require "date"
require "stringio"
require 'rocketamf/extensions'
require 'rocketamf/class_mapping'
require 'rocketamf/constants'
require 'rocketamf/remoting'
# RocketAMF is a full featured AMF0/3 serializer and deserializer with support for
# bi-directional Flash to Ruby class mapping, custom serialization and mapping,
# remoting gateway helpers that follow AMF0/3 messaging specs, and a suite of specs
# to ensure adherence to the specification documents put out by Adobe. If the C
# components compile, then RocketAMF automatically takes advantage of them to
# provide a substantial performance benefit. In addition, RocketAMF is fully
# compatible with Ruby 1.9.
#
# == Performance
#
# RocketAMF provides native C extensions for serialization, deserialization,
# remoting, and class mapping. If your environment supports them, RocketAMF will
# automatically take advantage of the C serializer, deserializer, and remoting
# support. The C class mapper has some substantial performance optimizations that
# make it incompatible with the pure Ruby class mapper, and so it must be manually
# enabled. For more information see <tt>RocketAMF::ClassMapping</tt>. Below are
# some benchmarks I took using a simple little benchmarking utility I whipped
# up, which can be found in the root of the repository.
#
# # 100000 objects
# # Ruby 1.8
# Testing native AMF0:
# minimum serialize time: 1.229868s
# minimum deserialize time: 0.86465s
# Testing native AMF3:
# minimum serialize time: 1.444652s
# minimum deserialize time: 0.879407s
# Testing pure AMF0:
# minimum serialize time: 25.427931s
# minimum deserialize time: 11.706084s
# Testing pure AMF3:
# minimum serialize time: 31.637864s
# minimum deserialize time: 14.773969s
#
# == Serialization & Deserialization
#
# RocketAMF provides two main methods - <tt>serialize</tt> and <tt>deserialize</tt>.
# Deserialization takes a String or StringIO object and the AMF version if different
# from the default. Serialization takes any Ruby object and the version if different
# from the default. Both default to AMF0, as it's more widely supported and slightly
# faster, but AMF3 does a better job of not sending duplicate data. Which you choose
# depends on what you need to communicate with and how much serialized size matters.
#
# == Mapping Classes Between Flash and Ruby
#
# RocketAMF provides a simple class mapping tool to facilitate serialization and
# deserialization of typed objects. Refer to the documentation of
# <tt>RocketAMF::ClassMapping</tt> for more details. If the provided class
# mapping tool is not sufficient for your needs, you also have the option to
# replace it with a class mapper of your own devising that matches the documented
# API.
#
# == Remoting
#
# You can use RocketAMF bare to write an AMF gateway using the following code.
# In addition, you can use rack-amf (http://github.com/rubyamf/rack-amf) or
# RubyAMF (http://github.com/rubyamf/rubyamf), both of which provide rack-compliant
# AMF gateways.
#
# # helloworld.ru
# require 'rocketamf'
#
# class HelloWorldApp
# APPLICATION_AMF = 'application/x-amf'.freeze
#
# def call env
# if is_amf?(env)
# # Wrap request and response
# env['rack.input'].rewind
# request = RocketAMF::Envelope.new.populate_from_stream(env['rack.input'].read)
# response = RocketAMF::Envelope.new
#
# # Handle request
# response.each_method_call request do |method, args|
#         raise "Service #{method} does not exist" unless method == 'App.helloWorld'
# 'Hello world'
# end
#
# # Pass back response
# response_str = response.serialize
# return [200, {'Content-Type' => APPLICATION_AMF, 'Content-Length' => response_str.length.to_s}, [response_str]]
# else
# return [200, {'Content-Type' => 'text/plain', 'Content-Length' => '16' }, ["Rack AMF gateway"]]
# end
# end
#
# private
# def is_amf? env
# return false unless env['CONTENT_TYPE'] == APPLICATION_AMF
# return false unless env['PATH_INFO'] == '/amf'
# return true
# end
# end
#
# run HelloWorldApp.new
#
# == Advanced Serialization (encode_amf and IExternalizable)
#
# RocketAMF provides some additional functionality to support advanced
# serialization techniques. If you define an <tt>encode_amf</tt> method on your
# object, it will get called during serialization. It is passed a single argument,
# the serializer, and it can use the serializer stream, the <tt>serialize</tt>
# method, the <tt>write_array</tt> method, the <tt>write_object</tt> method, and
# the serializer version. Below is a simple example that uses <tt>write_object</tt>
# to customize the property hash that is used for serialization.
#
# Example:
#
# class TestObject
# def encode_amf ser
# ser.write_object self, @attributes
# end
# end
#
# If you plan on using the <tt>serialize</tt> method, make sure to pass in the
# current serializer version, or you could create a message that cannot be deserialized.
#
# Example:
#
# class VariableObject
# def encode_amf ser
# if ser.version == 0
# ser.serialize 0, true
# else
# ser.serialize 3, false
# end
# end
# end
#
# If you wish to send and receive IExternalizable objects, you will need to
# implement <tt>encode_amf</tt>, <tt>read_external</tt>, and <tt>write_external</tt>.
# Below is an example of a ResultSet class that extends Array and serializes as
# an array collection. RocketAMF can automatically serialize arrays as
# ArrayCollection objects, so this is just an example of how you might implement
# an object that conforms to IExternalizable.
#
# Example:
#
# class ResultSet < Array
# def encode_amf ser
# if ser.version == 0
# # Serialize as simple array in AMF0
# ser.write_array self
# else
# # Serialize as an ArrayCollection object
# # It conforms to IExternalizable, does not have any dynamic properties,
# # and has no "sealed" members. See the AMF3 specs for more details about
# # object traits.
# ser.write_object self, nil, {
# :class_name => "flex.messaging.io.ArrayCollection",
# :externalizable => true,
# :dynamic => false,
# :members => []
# }
# end
# end
#
# # Write self as array to stream
# def write_external ser
# ser.write_array(self)
# end
#
# # Read array out and replace data with deserialized array.
# def read_external des
# replace(des.read_object)
# end
# end
module RocketAMF
begin
require 'rocketamf/ext'
rescue LoadError
require 'rocketamf/pure'
end
# Deserialize the AMF string _source_ of the given AMF version into a Ruby
# data structure and return it. Creates an instance of <tt>RocketAMF::Deserializer</tt>
# with a new instance of <tt>RocketAMF::ClassMapper</tt> and calls deserialize
# on it with the given source and amf version, returning the result.
def self.deserialize source, amf_version = 0
des = RocketAMF::Deserializer.new(RocketAMF::ClassMapper.new)
des.deserialize(amf_version, source)
end
# Serialize the given Ruby data structure _obj_ into an AMF stream using the
# given AMF version. Creates an instance of <tt>RocketAMF::Serializer</tt>
# with a new instance of <tt>RocketAMF::ClassMapper</tt> and calls serialize
# on it with the given object and amf version, returning the result.
def self.serialize obj, amf_version = 0
ser = RocketAMF::Serializer.new(RocketAMF::ClassMapper.new)
ser.serialize(amf_version, obj)
end
# We use const_missing to define the active ClassMapper at runtime. This way,
# heavy modification of class mapping functionality is still possible without
# forcing extenders to redefine the constant.
def self.const_missing const #:nodoc:
if const == :ClassMapper
RocketAMF.const_set(:ClassMapper, RocketAMF::ClassMapping)
else
super(const)
end
end
# The base exception for AMF errors.
class AMFError < StandardError; end
end
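As a concrete illustration of the wire formats the module-level `serialize`/`deserialize` helpers produce, here is a hand-rolled sketch (not the library API; helper names are hypothetical) of AMF0's simplest type, a number: marker byte `0x00` followed by an 8-byte big-endian IEEE-754 double.

```ruby
# AMF0 number: AMF0_NUMBER_MARKER (0x00) + big-endian double ('G' directive)
def amf0_encode_number(n)
  [0x00].pack('C') + [n.to_f].pack('G')
end

def amf0_decode_number(bytes)
  marker, payload = bytes.unpack('Ca8')
  raise "not an AMF0 number" unless marker == 0x00
  payload.unpack1('G')
end
```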


@ -0,0 +1,237 @@
require 'rocketamf/values/typed_hash'
require 'rocketamf/values/messages'
module RocketAMF
# Container for all mapped classes
class MappingSet
# Creates a mapping set object and populates the default mappings
def initialize
@as_mappings = {}
@ruby_mappings = {}
map_defaults
end
# Adds required mapping configs, calling map for the required base mappings.
# Designed to allow extenders to take advantage of required default mappings.
def map_defaults
map :as => 'flex.messaging.messages.AbstractMessage', :ruby => 'RocketAMF::Values::AbstractMessage'
map :as => 'flex.messaging.messages.RemotingMessage', :ruby => 'RocketAMF::Values::RemotingMessage'
map :as => 'flex.messaging.messages.AsyncMessage', :ruby => 'RocketAMF::Values::AsyncMessage'
map :as => 'DSA', :ruby => 'RocketAMF::Values::AsyncMessageExt'
map :as => 'flex.messaging.messages.CommandMessage', :ruby => 'RocketAMF::Values::CommandMessage'
map :as => 'DSC', :ruby => 'RocketAMF::Values::CommandMessageExt'
map :as => 'flex.messaging.messages.AcknowledgeMessage', :ruby => 'RocketAMF::Values::AcknowledgeMessage'
map :as => 'DSK', :ruby => 'RocketAMF::Values::AcknowledgeMessageExt'
map :as => 'flex.messaging.messages.ErrorMessage', :ruby => 'RocketAMF::Values::ErrorMessage'
self
end
# Map a given AS class to a ruby class.
#
# Use fully qualified names for both.
#
# Example:
#
# m.map :as => 'com.example.Date', :ruby => 'Example::Date'
def map params
[:as, :ruby].each {|k| params[k] = params[k].to_s} # Convert params to strings
@as_mappings[params[:as]] = params[:ruby]
@ruby_mappings[params[:ruby]] = params[:as]
end
# Returns the AS class name for the given ruby class name, returning nil if
# not found
def get_as_class_name class_name #:nodoc:
@ruby_mappings[class_name.to_s]
end
# Returns the ruby class name for the given AS class name, returning nil if
# not found
def get_ruby_class_name class_name #:nodoc:
@as_mappings[class_name.to_s]
end
end
# Handles class name mapping between actionscript and ruby and assists in
# serializing and deserializing data between them. Simply map an AS class to a
# ruby class and when the object is (de)serialized it will end up as the
# appropriate class.
#
# Example:
#
# RocketAMF::ClassMapper.define do |m|
# m.map :as => 'AsClass', :ruby => 'RubyClass'
# m.map :as => 'vo.User', :ruby => 'Model::User'
# end
#
# == Object Population/Serialization
#
# In addition to handling class name mapping, it also provides helper methods
# for populating ruby objects from AMF and extracting properties from ruby objects
# for serialization. Support for hash-like objects and objects using
# <tt>attr_accessor</tt> for properties is currently built in, but custom classes
# may require subclassing the class mapper to add support.
#
# == Complete Replacement
#
# In some cases, it may be beneficial to replace the default provider of class
# mapping completely. In this case, simply assign your class mapper class to
# <tt>RocketAMF::ClassMapper</tt> after loading RocketAMF. Through the magic of
# <tt>const_missing</tt>, <tt>ClassMapper</tt> is only defined after the first
# access by default, so you get no annoying warning messages. Custom class mappers
# must implement the following methods on instances: <tt>use_array_collection</tt>,
# <tt>get_as_class_name</tt>, <tt>get_ruby_obj</tt>, <tt>populate_ruby_obj</tt>,
# and <tt>props_for_serialization</tt>. In addition, it should have a class level
# <tt>mappings</tt> method that returns the mapping set it's using, although it's
# not required. If you'd like to see an example of what complete replacement
# offers, check out RubyAMF (http://github.com/rubyamf/rubyamf).
#
# Example:
#
# require 'rubygems'
# require 'rocketamf'
#
# RocketAMF::ClassMapper = MyCustomClassMapper
# # No warning about already initialized constant ClassMapper
# RocketAMF::ClassMapper # MyCustomClassMapper
#
# == C ClassMapper
#
# The C class mapper, <tt>RocketAMF::Ext::FastClassMapping</tt>, has the same
# public API that <tt>RocketAMF::ClassMapping</tt> does, but has some additional
# performance optimizations that may interfere with the proper serialization of
# objects. To reduce the cost of processing public methods for every object,
# its implementation of <tt>props_for_serialization</tt> caches valid properties
# by class, using the class as the hash key for property lookup. This means that
# adding and removing properties from instances while serializing using a given
# class mapper instance will result in the changes not being detected. As such,
# it's not enabled by default. So long as you aren't planning on modifying
# classes during serialization using <tt>encode_amf</tt>, the faster C class
# mapper should be perfectly safe to use.
#
# Activating the C Class Mapper:
#
# require 'rubygems'
# require 'rocketamf'
# RocketAMF::ClassMapper = RocketAMF::Ext::FastClassMapping
class ClassMapping
class << self
# Global configuration variable for sending Arrays as ArrayCollections.
# Defaults to false.
attr_accessor :use_array_collection
# Returns the mapping set with all the class mappings that is currently
# being used.
def mappings
@mappings ||= MappingSet.new
end
# Define class mappings in the block. Block is passed a <tt>MappingSet</tt> object
# as the first parameter.
#
# Example:
#
# RocketAMF::ClassMapper.define do |m|
# m.map :as => 'AsClass', :ruby => 'RubyClass'
# end
def define &block #:yields: mapping_set
yield mappings
end
# Reset all class mappings except the defaults and return
# <tt>use_array_collection</tt> to false
def reset
@use_array_collection = false
@mappings = nil
end
end
attr_reader :use_array_collection
# Copies configuration from class level configs to populate object
def initialize
@mappings = self.class.mappings
@use_array_collection = self.class.use_array_collection === true
end
# Returns the ActionScript class name for the given ruby object. Will also
# take a string containing the ruby class name.
def get_as_class_name obj
# Get class name
if obj.is_a?(String)
ruby_class_name = obj
elsif obj.is_a?(Values::TypedHash)
ruby_class_name = obj.type
elsif obj.is_a?(Hash)
return nil
else
ruby_class_name = obj.class.name
end
# Get mapped AS class name
@mappings.get_as_class_name ruby_class_name
end
# Instantiates a ruby object using the mapping configuration based on the
# source ActionScript class name. If there is no mapping defined, it returns
# a <tt>RocketAMF::Values::TypedHash</tt> with the serialized class name.
def get_ruby_obj as_class_name
ruby_class_name = @mappings.get_ruby_class_name as_class_name
if ruby_class_name.nil?
# Populate a simple hash, since no mapping
return Values::TypedHash.new(as_class_name)
else
ruby_class = ruby_class_name.split('::').inject(Kernel) {|scope, const_name| scope.const_get(const_name)}
return ruby_class.new
end
end
# Populates the ruby object using the given properties. props and
# dynamic_props will be hashes with symbols for keys.
def populate_ruby_obj obj, props, dynamic_props=nil
props.merge! dynamic_props if dynamic_props
# Don't even bother checking if it responds to setter methods if it's a TypedHash
if obj.is_a?(Values::TypedHash)
obj.merge! props
return obj
end
# Some type of object
hash_like = obj.respond_to?("[]=")
props.each do |key, value|
if obj.respond_to?("#{key}=")
obj.send("#{key}=", value)
elsif hash_like
obj[key] = value
end
end
obj
end
# Extracts all exportable properties from the given ruby object and returns
# them in a hash. If overriding, make sure to return a hash with string keys
# unless you are only going to be using the native C extensions, as the pure
# ruby serializer performs a sort on the keys to achieve consistent, testable
# results.
def props_for_serialization ruby_obj
# Handle hashes
if ruby_obj.is_a?(Hash)
# Stringify keys to make it easier later on and allow sorting
h = {}
ruby_obj.each {|k,v| h[k.to_s] = v}
return h
end
# Generic object serializer
props = {}
@ignored_props ||= Object.new.public_methods
(ruby_obj.public_methods - @ignored_props).each do |method_name|
# Add them to the prop hash if they take no arguments
method_def = ruby_obj.method(method_name)
props[method_name.to_s] = ruby_obj.send(method_name) if method_def.arity == 0
end
props
end
end
end
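The doc comment above describes how the C FastClassMapping caches serializable properties keyed by class, which is why per-instance changes made mid-serialization go unnoticed. A sketch of that caching behavior in pure Ruby (hypothetical class name, just to demonstrate the described trade-off):

```ruby
# Class-keyed property cache: zero-arity public readers are computed once
# per class, so later singleton methods on instances are not detected.
class CachedPropsMapper
  def initialize
    @ignored = Object.new.public_methods
    @cache = {} # class => list of zero-arity reader names
  end

  def props_for_serialization(obj)
    names = (@cache[obj.class] ||=
      (obj.public_methods - @ignored).select { |m| obj.method(m).arity == 0 })
    names.each_with_object({}) { |m, h| h[m.to_s] = obj.send(m) }
  end
end
```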


@ -0,0 +1,50 @@
module RocketAMF
# AMF0 Type Markers
AMF0_NUMBER_MARKER = 0x00 #"\000"
AMF0_BOOLEAN_MARKER = 0x01 #"\001"
AMF0_STRING_MARKER = 0x02 #"\002"
AMF0_OBJECT_MARKER = 0x03 #"\003"
AMF0_MOVIE_CLIP_MARKER = 0x04 #"\004" # Unused
AMF0_NULL_MARKER = 0x05 #"\005"
AMF0_UNDEFINED_MARKER = 0x06 #"\006"
AMF0_REFERENCE_MARKER = 0x07 #"\a"
AMF0_HASH_MARKER = 0x08 #"\b"
AMF0_OBJECT_END_MARKER = 0x09 #"\t"
AMF0_STRICT_ARRAY_MARKER = 0x0A #"\n"
AMF0_DATE_MARKER = 0x0B #"\v"
AMF0_LONG_STRING_MARKER = 0x0C #"\f"
AMF0_UNSUPPORTED_MARKER = 0x0D #"\r"
AMF0_RECORDSET_MARKER = 0x0E #"\016" # Unused
AMF0_XML_MARKER = 0x0F #"\017"
AMF0_TYPED_OBJECT_MARKER = 0x10 #"\020"
AMF0_AMF3_MARKER = 0x11 #"\021"
# AMF3 Type Markers
AMF3_UNDEFINED_MARKER = 0x00 #"\000"
AMF3_NULL_MARKER = 0x01 #"\001"
AMF3_FALSE_MARKER = 0x02 #"\002"
AMF3_TRUE_MARKER = 0x03 #"\003"
AMF3_INTEGER_MARKER = 0x04 #"\004"
AMF3_DOUBLE_MARKER = 0x05 #"\005"
AMF3_STRING_MARKER = 0x06 #"\006"
AMF3_XML_DOC_MARKER = 0x07 #"\a"
AMF3_DATE_MARKER = 0x08 #"\b"
AMF3_ARRAY_MARKER = 0x09 #"\t"
AMF3_OBJECT_MARKER = 0x0A #"\n"
AMF3_XML_MARKER = 0x0B #"\v"
AMF3_BYTE_ARRAY_MARKER = 0x0C #"\f"
AMF3_VECTOR_INT_MARKER = 0x0D #"\r"
AMF3_VECTOR_UINT_MARKER = 0x0E #"\016"
AMF3_VECTOR_DOUBLE_MARKER = 0x0F #"\017"
AMF3_VECTOR_OBJECT_MARKER = 0x10 #"\020"
AMF3_DICT_MARKER = 0x11 #"\021"
# Other AMF3 Markers
AMF3_EMPTY_STRING = 0x01
AMF3_CLOSE_DYNAMIC_OBJECT = 0x01
AMF3_CLOSE_DYNAMIC_ARRAY = 0x01
# Other Constants
MAX_INTEGER = 268435455
MIN_INTEGER = -268435456
end


@ -0,0 +1,28 @@
begin
# Fat binaries for Windows
RUBY_VERSION =~ /(\d+\.\d+)/
require "#{$1}/rocketamf_ext"
rescue LoadError
require "rocketamf_ext"
end
module RocketAMF
# This module holds all the modules/classes that implement AMF's functionality
# in C
module Ext
$DEBUG and warn "Using C library for RocketAMF."
end
#:stopdoc:
# Import serializer/deserializer
Deserializer = RocketAMF::Ext::Deserializer
Serializer = RocketAMF::Ext::Serializer
# Modify envelope so it can serialize/deserialize
class Envelope
remove_method :populate_from_stream
remove_method :serialize
include RocketAMF::Ext::Envelope
end
#:startdoc:
end


@ -0,0 +1,22 @@
# Joc's monkeypatch for string bytesize (only available in 1.8.7+)
if !"amf".respond_to? :bytesize
class String #:nodoc:
def bytesize
self.size
end
end
end
# Add <tt>ArrayCollection</tt> override to arrays
class Array
# Override <tt>RocketAMF::ClassMapper.use_array_collection</tt> setting for
# this array. Adds <tt>is_array_collection?</tt> method, which is used by the
# serializer over the global config if defined.
def is_array_collection= a
@is_array_collection = a
def self.is_array_collection? #:nodoc:
@is_array_collection
end
end
end


@ -0,0 +1,24 @@
require 'rocketamf/pure/deserializer'
require 'rocketamf/pure/serializer'
require 'rocketamf/pure/remoting'
module RocketAMF
# This module holds all the modules/classes that implement AMF's functionality
# in pure ruby
module Pure
$DEBUG and warn "Using pure library for RocketAMF."
end
#:stopdoc:
# Import serializer/deserializer
Deserializer = RocketAMF::Pure::Deserializer
Serializer = RocketAMF::Pure::Serializer
# Modify envelope so it can serialize/deserialize
class Envelope
remove_method :populate_from_stream
remove_method :serialize
include RocketAMF::Pure::Envelope
end
#:startdoc:
end


@ -0,0 +1,455 @@
require 'rocketamf/pure/io_helpers'
module RocketAMF
module Pure
# Pure ruby deserializer for AMF0 and AMF3
class Deserializer
attr_accessor :source
# Pass in the class mapper instance to use when deserializing. This
# enables better caching behavior in the class mapper and allows
# one to change mappings between deserialization attempts.
def initialize class_mapper
@class_mapper = class_mapper
end
# Deserialize the source using AMF0 or AMF3. Source should either
# be a string or StringIO object. If you pass a StringIO object,
# it will have its position updated to the end of the deserialized
# data.
def deserialize version, source
raise ArgumentError, "unsupported version #{version}" unless [0,3].include?(version)
@version = version
if StringIO === source
@source = source
elsif source
@source = StringIO.new(source)
elsif @source.nil?
raise AMFError, "no source to deserialize"
end
if @version == 0
@ref_cache = []
return amf0_deserialize
else
@string_cache = []
@object_cache = []
@trait_cache = []
return amf3_deserialize
end
end
# Reads an object from the deserializer's stream and returns it.
def read_object
if @version == 0
return amf0_deserialize
else
return amf3_deserialize
end
end
private
include RocketAMF::Pure::ReadIOHelpers
def amf0_deserialize type=nil
type = read_int8 @source unless type
case type
when AMF0_NUMBER_MARKER
amf0_read_number
when AMF0_BOOLEAN_MARKER
amf0_read_boolean
when AMF0_STRING_MARKER
amf0_read_string
when AMF0_OBJECT_MARKER
amf0_read_object
when AMF0_NULL_MARKER
nil
when AMF0_UNDEFINED_MARKER
nil
when AMF0_REFERENCE_MARKER
amf0_read_reference
when AMF0_HASH_MARKER
amf0_read_hash
when AMF0_STRICT_ARRAY_MARKER
amf0_read_array
when AMF0_DATE_MARKER
amf0_read_date
when AMF0_LONG_STRING_MARKER
amf0_read_string true
when AMF0_UNSUPPORTED_MARKER
nil
when AMF0_XML_MARKER
amf0_read_string true
when AMF0_TYPED_OBJECT_MARKER
amf0_read_typed_object
when AMF0_AMF3_MARKER
deserialize(3, nil)
else
raise AMFError, "Invalid type: #{type}"
end
end
def amf0_read_number
res = read_double @source
(res.is_a?(Float) && res.nan?) ? nil : res # check for NaN and convert them to nil
end
def amf0_read_boolean
read_int8(@source) != 0
end
def amf0_read_string long=false
len = long ? read_word32_network(@source) : read_word16_network(@source)
str = @source.read(len)
str.force_encoding("UTF-8") if str.respond_to?(:force_encoding)
str
end
def amf0_read_reference
index = read_word16_network(@source)
@ref_cache[index]
end
def amf0_read_array
len = read_word32_network(@source)
array = []
@ref_cache << array
0.upto(len - 1) do
array << amf0_deserialize
end
array
end
def amf0_read_date
seconds = read_double(@source).to_f/1000
time = Time.at(seconds)
tz = read_word16_network(@source) # Unused
time
end
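The AMF0 date payload handled above is an 8-byte big-endian double of milliseconds since the epoch, followed by a 2-byte timezone offset that the deserializer reads and discards. A standalone decoding sketch (hypothetical helper, not library API):

```ruby
require 'stringio'

def amf0_decode_date(io)
  millis = io.read(8).unpack1('G') # big-endian IEEE-754 double
  io.read(2)                       # 2-byte timezone offset, unused
  Time.at(millis / 1000.0)
end
```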
def amf0_read_props obj={}
while true
key = amf0_read_string
type = read_int8 @source
break if type == AMF0_OBJECT_END_MARKER
obj[key] = amf0_deserialize(type)
end
obj
end
def amf0_read_hash
len = read_word32_network(@source) # Read and ignore length
obj = {}
@ref_cache << obj
amf0_read_props obj
end
def amf0_read_object add_to_ref_cache=true
# Create "object" and add to ref cache (it's always a Hash)
obj = @class_mapper.get_ruby_obj ""
@ref_cache << obj
# Populate object
props = amf0_read_props
@class_mapper.populate_ruby_obj obj, props
return obj
end
def amf0_read_typed_object
# Create object to add to ref cache
class_name = amf0_read_string
obj = @class_mapper.get_ruby_obj class_name
@ref_cache << obj
# Populate object
props = amf0_read_props
@class_mapper.populate_ruby_obj obj, props
return obj
end
def amf3_deserialize
type = read_int8 @source
case type
when AMF3_UNDEFINED_MARKER
nil
when AMF3_NULL_MARKER
nil
when AMF3_FALSE_MARKER
false
when AMF3_TRUE_MARKER
true
when AMF3_INTEGER_MARKER
amf3_read_integer
when AMF3_DOUBLE_MARKER
amf3_read_number
when AMF3_STRING_MARKER
amf3_read_string
when AMF3_XML_DOC_MARKER, AMF3_XML_MARKER
amf3_read_xml
when AMF3_DATE_MARKER
amf3_read_date
when AMF3_ARRAY_MARKER
amf3_read_array
when AMF3_OBJECT_MARKER
amf3_read_object
when AMF3_BYTE_ARRAY_MARKER
amf3_read_byte_array
when AMF3_VECTOR_INT_MARKER, AMF3_VECTOR_UINT_MARKER, AMF3_VECTOR_DOUBLE_MARKER, AMF3_VECTOR_OBJECT_MARKER
amf3_read_vector type
when AMF3_DICT_MARKER
amf3_read_dict
else
raise AMFError, "Invalid type: #{type}"
end
end
def amf3_read_integer
n = 0
b = read_word8(@source) || 0
result = 0
while ((b & 0x80) != 0 && n < 3)
result = result << 7
result = result | (b & 0x7f)
b = read_word8(@source) || 0
n = n + 1
end
if (n < 3)
result = result << 7
result = result | b
else
#Use all 8 bits from the 4th byte
result = result << 8
result = result | b
#Check if the integer should be negative
if (result > MAX_INTEGER)
result -= (1 << 29)
end
end
result
end
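The variable-length U29 integer format decoded above can be exercised end to end: the first three bytes carry 7 payload bits each (high bit set means "more bytes follow"), the fourth carries a full 8 bits, and values above `MAX_INTEGER` wrap into negatives. A sketch of both directions (hypothetical helper names; the library exposes no such public API):

```ruby
require 'stringio'

# Encode a 29-bit signed integer as an AMF3 U29 byte string.
def encode_u29(n)
  n &= 0x1FFFFFFF # wrap negatives into 29 bits
  if n < 0x80
    [n].pack('C')
  elsif n < 0x4000
    [(n >> 7) | 0x80, n & 0x7F].pack('C*')
  elsif n < 0x200000
    [(n >> 14) | 0x80, ((n >> 7) & 0x7F) | 0x80, n & 0x7F].pack('C*')
  else
    [(n >> 22) | 0x80, ((n >> 15) & 0x7F) | 0x80,
     ((n >> 8) & 0x7F) | 0x80, n & 0xFF].pack('C*')
  end
end

# Decode, mirroring amf3_read_integer's loop and sign handling.
def decode_u29(io)
  n, result = 0, 0
  b = io.getbyte
  while (b & 0x80) != 0 && n < 3
    result = (result << 7) | (b & 0x7F)
    b = io.getbyte
    n += 1
  end
  result = n < 3 ? (result << 7) | b : (result << 8) | b
  result > 0xFFFFFFF ? result - (1 << 29) : result # MAX_INTEGER check
end
```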
def amf3_read_number
res = read_double @source
(res.is_a?(Float) && res.nan?) ? nil : res # check for NaN and convert them to nil
end
def amf3_read_string
type = amf3_read_integer
is_reference = (type & 0x01) == 0
if is_reference
reference = type >> 1
return @string_cache[reference]
else
length = type >> 1
str = ""
if length > 0
str = @source.read(length)
str.force_encoding("UTF-8") if str.respond_to?(:force_encoding)
@string_cache << str
end
return str
end
end
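The `(type & 0x01) == 0` check above recurs in every AMF3 reader: the low bit of the leading U29 distinguishes an inline value (bit set; remaining bits are a length or trait descriptor) from a back-reference (bit clear; remaining bits index a cache). A tiny helper naming that pattern (hypothetical, for illustration):

```ruby
# Split an AMF3 U29 header into its kind and its remaining 28 bits.
def split_u29_header(type)
  (type & 0x01).zero? ? [:reference, type >> 1] : [:inline, type >> 1]
end
```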
def amf3_read_xml
type = amf3_read_integer
is_reference = (type & 0x01) == 0
if is_reference
reference = type >> 1
return @object_cache[reference]
else
length = type >> 1
str = ""
if length > 0
str = @source.read(length)
str.force_encoding("UTF-8") if str.respond_to?(:force_encoding)
@object_cache << str
end
return str
end
end
def amf3_read_byte_array
type = amf3_read_integer
is_reference = (type & 0x01) == 0
if is_reference
reference = type >> 1
return @object_cache[reference]
else
length = type >> 1
obj = StringIO.new @source.read(length)
@object_cache << obj
obj
end
end
def amf3_read_array
type = amf3_read_integer
is_reference = (type & 0x01) == 0
if is_reference
reference = type >> 1
return @object_cache[reference]
else
length = type >> 1
property_name = amf3_read_string
array = property_name.length > 0 ? {} : []
@object_cache << array
while property_name.length > 0
value = amf3_deserialize
array[property_name] = value
property_name = amf3_read_string
end
0.upto(length - 1) {|i| array[i] = amf3_deserialize }
array
end
end
def amf3_read_object
type = amf3_read_integer
is_reference = (type & 0x01) == 0
if is_reference
reference = type >> 1
return @object_cache[reference]
else
class_type = type >> 1
class_is_reference = (class_type & 0x01) == 0
if class_is_reference
reference = class_type >> 1
traits = @trait_cache[reference]
else
externalizable = (class_type & 0x02) != 0
dynamic = (class_type & 0x04) != 0
attribute_count = class_type >> 3
class_name = amf3_read_string
class_attributes = []
attribute_count.times{class_attributes << amf3_read_string} # Read class members
traits = {
:class_name => class_name,
:members => class_attributes,
:externalizable => externalizable,
:dynamic => dynamic
}
@trait_cache << traits
end
# Optimization for deserializing ArrayCollection
if traits[:class_name] == "flex.messaging.io.ArrayCollection"
arr = amf3_deserialize # Adds ArrayCollection array to object cache
@object_cache << arr # Add again for ArrayCollection source array
return arr
end
obj = @class_mapper.get_ruby_obj traits[:class_name]
@object_cache << obj
if traits[:externalizable]
obj.read_external self
else
props = {}
traits[:members].each do |key|
value = amf3_deserialize
props[key] = value
end
dynamic_props = nil
if traits[:dynamic]
dynamic_props = {}
while (key = amf3_read_string) && key.length != 0 do # read next key
value = amf3_deserialize
dynamic_props[key] = value
end
end
@class_mapper.populate_ruby_obj obj, props, dynamic_props
end
obj
end
end
def amf3_read_date
type = amf3_read_integer
is_reference = (type & 0x01) == 0
if is_reference
reference = type >> 1
return @object_cache[reference]
else
seconds = read_double(@source).to_f/1000
time = Time.at(seconds)
@object_cache << time
time
end
end
def amf3_read_dict
type = amf3_read_integer
is_reference = (type & 0x01) == 0
if is_reference
reference = type >> 1
return @object_cache[reference]
else
dict = {}
@object_cache << dict
length = type >> 1
weak_keys = read_int8 @source # Ignore: Not supported in ruby
0.upto(length - 1) do |i|
dict[amf3_deserialize] = amf3_deserialize
end
dict
end
end
def amf3_read_vector vector_type
type = amf3_read_integer
is_reference = (type & 0x01) == 0
if is_reference
reference = type >> 1
return @object_cache[reference]
else
vec = []
@object_cache << vec
length = type >> 1
fixed_vector = read_int8 @source # Ignore
case vector_type
when AMF3_VECTOR_INT_MARKER
0.upto(length - 1) do |i|
int = read_word32_network(@source)
int = int - 2**32 if int > MAX_INTEGER
vec << int
end
when AMF3_VECTOR_UINT_MARKER
0.upto(length - 1) do |i|
vec << read_word32_network(@source)
end
when AMF3_VECTOR_DOUBLE_MARKER
0.upto(length - 1) do |i|
vec << amf3_read_number
end
when AMF3_VECTOR_OBJECT_MARKER
vector_class = amf3_read_string # Ignore: Ruby has no typed vectors
0.upto(length - 1) do |i|
vec << amf3_deserialize
end
end
vec
end
end
end
end
end
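The variable-length integer logic in `amf3_read_integer` above packs up to 29 bits into 1 to 4 bytes: the high bit of each of the first three bytes signals that another byte follows, and the fourth byte (when present) contributes all 8 of its bits. A standalone sketch of that decoding (the `decode_u29` helper name is hypothetical, not part of the library):

```ruby
# Standalone sketch of AMF3 U29 decoding, mirroring amf3_read_integer.
MAX_U29 = 2**28 - 1 # largest positive 29-bit value (268435455)

def decode_u29(bytes)
  n = 0
  result = 0
  b = bytes[n]
  # First three bytes: high bit set means "more bytes follow"; low 7 bits are data
  while (b & 0x80) != 0 && n < 3
    result = (result << 7) | (b & 0x7f)
    n += 1
    b = bytes[n]
  end
  if n < 3
    result = (result << 7) | b
  else
    result = (result << 8) | b            # 4th byte contributes all 8 bits
    result -= 1 << 29 if result > MAX_U29 # values above the max are negative
  end
  result
end
```

For example, `[0x7f]` decodes to 127, `[0x81, 0x00]` to 128, and four `0xff` bytes decode to -1.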


@ -0,0 +1,94 @@
module RocketAMF
module Pure
module ReadIOHelpers #:nodoc:
def read_int8 source
source.read(1).unpack('c').first
end
def read_word8 source
source.read(1).unpack('C').first
end
def read_double source
source.read(8).unpack('G').first
end
def read_word16_network source
source.read(2).unpack('n').first
end
def read_int16_network source
str = source.read(2)
str.reverse! if byte_order_little? # swap bytes as native=little (and we want network)
str.unpack('s').first
end
def read_word32_network source
source.read(4).unpack('N').first
end
def byte_order
if [0x12345678].pack("L") == "\x12\x34\x56\x78"
:BigEndian
else
:LittleEndian
end
end
def byte_order_little?
byte_order == :LittleEndian
end
end
module WriteIOHelpers #:nodoc:
def pack_integer(integer)
integer = integer & 0x1fffffff
if(integer < 0x80)
[integer].pack('c')
elsif(integer < 0x4000)
[integer >> 7 & 0x7f | 0x80].pack('c')+
[integer & 0x7f].pack('c')
elsif(integer < 0x200000)
[integer >> 14 & 0x7f | 0x80].pack('c') +
[integer >> 7 & 0x7f | 0x80].pack('c') +
[integer & 0x7f].pack('c')
else
[integer >> 22 & 0x7f | 0x80].pack('c')+
[integer >> 15 & 0x7f | 0x80].pack('c')+
[integer >> 8 & 0x7f | 0x80].pack('c')+
[integer & 0xff].pack('c')
end
end
def pack_double(double)
[double].pack('G')
end
def pack_int8(val)
[val].pack('c')
end
def pack_int16_network(val)
[val].pack('n')
end
def pack_word32_network(val)
str = [val].pack('L')
str.reverse! if byte_order_little? # swap bytes as native=little (and we want network)
str
end
def byte_order
if [0x12345678].pack("L") == "\x12\x34\x56\x78"
:BigEndian
else
:LittleEndian
end
end
def byte_order_little?
byte_order == :LittleEndian
end
end
end
end
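`pack_integer` above emits the same U29 format from the writing side: 7 data bits per byte with a continuation high bit, except the 4-byte form, which shifts by 22/15/8 so the final byte carries all 8 bits. A minimal standalone sketch (the `pack_u29` name is hypothetical; it uses `'C'` for unsigned bytes rather than the module's `'c'`, which packs to the same bytes here):

```ruby
# Standalone sketch of AMF3 U29 encoding, mirroring pack_integer.
def pack_u29(integer)
  integer &= 0x1fffffff # truncate to 29 bits
  if integer < 0x80
    [integer].pack('C')
  elsif integer < 0x4000
    [(integer >> 7) & 0x7f | 0x80, integer & 0x7f].pack('C*')
  elsif integer < 0x200000
    [(integer >> 14) & 0x7f | 0x80, (integer >> 7) & 0x7f | 0x80,
     integer & 0x7f].pack('C*')
  else
    # 4-byte form: last byte keeps all 8 bits, hence shifts of 22/15/8
    [(integer >> 22) & 0x7f | 0x80, (integer >> 15) & 0x7f | 0x80,
     (integer >> 8) & 0x7f | 0x80, integer & 0xff].pack('C*')
  end
end
```

This round-trips with the deserializer's integer reader: 127 packs to one byte, 128 to `0x81 0x00`, and -1 (truncated to `0x1fffffff`) to four `0xff` bytes.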


@ -0,0 +1,117 @@
require 'rocketamf/pure/io_helpers'
module RocketAMF
module Pure
# Included into RocketAMF::Envelope, this module replaces the
# populate_from_stream and serialize methods with actual working versions
module Envelope
# Included into RocketAMF::Envelope, this method handles deserializing an
# AMF request/response into the envelope
def populate_from_stream stream, class_mapper=nil
stream = StringIO.new(stream) unless StringIO === stream
des = Deserializer.new(class_mapper || RocketAMF::ClassMapper.new)
des.source = stream
# Initialize
@amf_version = 0
@headers = {}
@messages = []
# Read AMF version
@amf_version = read_word16_network stream
# Read in headers
header_count = read_word16_network stream
0.upto(header_count-1) do
name = stream.read(read_word16_network(stream))
name.force_encoding("UTF-8") if name.respond_to?(:force_encoding)
must_understand = read_int8(stream) != 0
length = read_word32_network stream
data = des.deserialize(0, nil)
@headers[name] = RocketAMF::Header.new(name, must_understand, data)
end
# Read in messages
message_count = read_word16_network stream
0.upto(message_count-1) do
target_uri = stream.read(read_word16_network(stream))
target_uri.force_encoding("UTF-8") if target_uri.respond_to?(:force_encoding)
response_uri = stream.read(read_word16_network(stream))
response_uri.force_encoding("UTF-8") if response_uri.respond_to?(:force_encoding)
length = read_word32_network stream
data = des.deserialize(0, nil)
if data.is_a?(Array) && data.length == 1 && data[0].is_a?(::RocketAMF::Values::AbstractMessage)
data = data[0]
end
@messages << RocketAMF::Message.new(target_uri, response_uri, data)
end
self
end
# Included into RocketAMF::Envelope, this method handles serializing an
# AMF request/response into a string
def serialize class_mapper=nil
ser = Serializer.new(class_mapper || RocketAMF::ClassMapper.new)
stream = ser.stream
# Write version
stream << pack_int16_network(@amf_version)
# Write headers
stream << pack_int16_network(@headers.length) # Header count
@headers.each_value do |h|
# Write header name
name_str = h.name
name_str.encode!("UTF-8").force_encoding("ASCII-8BIT") if name_str.respond_to?(:encode)
stream << pack_int16_network(name_str.bytesize)
stream << name_str
# Write must understand flag
stream << pack_int8(h.must_understand ? 1 : 0)
# Serialize data
stream << pack_word32_network(-1) # length of data - -1 if you don't know
ser.serialize(0, h.data)
end
# Write messages
stream << pack_int16_network(@messages.length) # Message count
@messages.each do |m|
# Write target_uri
uri_str = m.target_uri
uri_str.encode!("UTF-8").force_encoding("ASCII-8BIT") if uri_str.respond_to?(:encode)
stream << pack_int16_network(uri_str.bytesize)
stream << uri_str
# Write response_uri
uri_str = m.response_uri
uri_str.encode!("UTF-8").force_encoding("ASCII-8BIT") if uri_str.respond_to?(:encode)
stream << pack_int16_network(uri_str.bytesize)
stream << uri_str
# Serialize data
stream << pack_word32_network(-1) # length of data - -1 if you don't know
if @amf_version == 3
stream << AMF0_AMF3_MARKER
ser.serialize(3, m.data)
else
ser.serialize(0, m.data)
end
end
stream
end
private
include RocketAMF::Pure::ReadIOHelpers
include RocketAMF::Pure::WriteIOHelpers
end
end
end
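The envelope preamble that `populate_from_stream` reads is three network-order 16-bit words: AMF version, header count, then (after any headers) message count. A minimal sketch of just that preamble, under the assumption of an empty envelope:

```ruby
require 'stringio'

# Build a minimal AMF envelope preamble: version 3, no headers, no messages.
preamble = StringIO.new([3, 0, 0].pack('n3'))

amf_version   = preamble.read(2).unpack('n').first
header_count  = preamble.read(2).unpack('n').first
message_count = preamble.read(2).unpack('n').first
```

In a real envelope, each header and message body would follow its count, length-prefixed as in the code above.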


@ -0,0 +1,474 @@
require 'rocketamf/pure/io_helpers'
module RocketAMF
module Pure
# Pure ruby serializer for AMF0 and AMF3
class Serializer
attr_reader :stream, :version
# Pass in the class mapper instance to use when serializing. This enables
# better caching behavior in the class mapper and allows one to change
# mappings between serialization attempts.
def initialize class_mapper
@class_mapper = class_mapper
@stream = ""
@depth = 0
end
# Serialize the given object using AMF0 or AMF3. Can be called from inside
# encode_amf, but make sure to pass in the proper version or it may not be
# possible to decode. Use the serializer version attribute for this.
def serialize version, obj
raise ArgumentError, "unsupported version #{version}" unless [0,3].include?(version)
@version = version
# Initialize caches
if @depth == 0
if @version == 0
@ref_cache = SerializerCache.new :object
else
@string_cache = SerializerCache.new :string
@object_cache = SerializerCache.new :object
@trait_cache = SerializerCache.new :string
end
end
@depth += 1
# Perform serialization
if @version == 0
amf0_serialize(obj)
else
amf3_serialize(obj)
end
# Cleanup
@depth -= 1
if @depth == 0
@ref_cache = nil
@string_cache = nil
@object_cache = nil
@trait_cache = nil
end
return @stream
end
# Helper for writing arrays inside encode_amf. It uses the current AMF
# version to write the array.
def write_array arr
if @version == 0
amf0_write_array arr
else
amf3_write_array arr
end
end
# Helper for writing objects inside encode_amf. It uses the current AMF
# version to write the object. If you pass in a property hash, it will use
# it rather than having the class mapper determine properties. For AMF3,
# you can also specify a traits hash, which can be used to reduce serialized
# data size or serialize things as externalizable.
def write_object obj, props=nil, traits=nil
if @version == 0
amf0_write_object obj, props
else
amf3_write_object obj, props, traits
end
end
private
include RocketAMF::Pure::WriteIOHelpers
def amf0_serialize obj
if @ref_cache[obj] != nil
amf0_write_reference @ref_cache[obj]
elsif obj.respond_to?(:encode_amf)
obj.encode_amf(self)
elsif obj.is_a?(NilClass)
amf0_write_null
elsif obj.is_a?(TrueClass) || obj.is_a?(FalseClass)
amf0_write_boolean obj
elsif obj.is_a?(Numeric)
amf0_write_number obj
elsif obj.is_a?(Symbol) || obj.is_a?(String)
amf0_write_string obj.to_s
elsif obj.is_a?(Time)
amf0_write_time obj
elsif obj.is_a?(Date)
amf0_write_date obj
elsif obj.is_a?(Array)
amf0_write_array obj
elsif obj.is_a?(Hash) || obj.is_a?(Object)
amf0_write_object obj
end
end
def amf0_write_null
@stream << AMF0_NULL_MARKER
end
def amf0_write_boolean bool
@stream << AMF0_BOOLEAN_MARKER
@stream << pack_int8(bool ? 1 : 0)
end
def amf0_write_number num
@stream << AMF0_NUMBER_MARKER
@stream << pack_double(num)
end
def amf0_write_string str
str = str.encode("UTF-8").force_encoding("ASCII-8BIT") if str.respond_to?(:encode)
len = str.bytesize
if len > 2**16-1
@stream << AMF0_LONG_STRING_MARKER
@stream << pack_word32_network(len)
else
@stream << AMF0_STRING_MARKER
@stream << pack_int16_network(len)
end
@stream << str
end
def amf0_write_time time
@stream << AMF0_DATE_MARKER
time = time.getutc # Dup and convert to UTC
milli = (time.to_f * 1000).to_i
@stream << pack_double(milli)
@stream << pack_int16_network(0) # Time zone
end
def amf0_write_date date
@stream << AMF0_DATE_MARKER
@stream << pack_double(date.strftime("%Q").to_i)
@stream << pack_int16_network(0) # Time zone
end
def amf0_write_reference index
@stream << AMF0_REFERENCE_MARKER
@stream << pack_int16_network(index)
end
def amf0_write_array array
@ref_cache.add_obj array
@stream << AMF0_STRICT_ARRAY_MARKER
@stream << pack_word32_network(array.length)
array.each do |elem|
amf0_serialize elem
end
end
def amf0_write_object obj, props=nil
@ref_cache.add_obj obj
props = @class_mapper.props_for_serialization obj if props.nil?
# Is it a typed object?
class_name = @class_mapper.get_as_class_name obj
if class_name
class_name = class_name.encode("UTF-8").force_encoding("ASCII-8BIT") if class_name.respond_to?(:encode)
@stream << AMF0_TYPED_OBJECT_MARKER
@stream << pack_int16_network(class_name.bytesize)
@stream << class_name
else
@stream << AMF0_OBJECT_MARKER
end
# Write prop list
props.sort.each do |key, value| # Sort keys before writing
key = key.encode("UTF-8").force_encoding("ASCII-8BIT") if key.respond_to?(:encode)
@stream << pack_int16_network(key.bytesize)
@stream << key
amf0_serialize value
end
# Write end
@stream << pack_int16_network(0)
@stream << AMF0_OBJECT_END_MARKER
end
def amf3_serialize obj
if obj.respond_to?(:encode_amf)
obj.encode_amf(self)
elsif obj.is_a?(NilClass)
amf3_write_null
elsif obj.is_a?(TrueClass)
amf3_write_true
elsif obj.is_a?(FalseClass)
amf3_write_false
elsif obj.is_a?(Numeric)
amf3_write_numeric obj
elsif obj.is_a?(Symbol) || obj.is_a?(String)
amf3_write_string obj.to_s
elsif obj.is_a?(Time)
amf3_write_time obj
elsif obj.is_a?(Date)
amf3_write_date obj
elsif obj.is_a?(StringIO)
amf3_write_byte_array obj
elsif obj.is_a?(Array)
amf3_write_array obj
elsif obj.is_a?(Hash) || obj.is_a?(Object)
amf3_write_object obj
end
end
def amf3_write_reference index
header = index << 1 # shift value left to leave a low bit of 0
@stream << pack_integer(header)
end
def amf3_write_null
@stream << AMF3_NULL_MARKER
end
def amf3_write_true
@stream << AMF3_TRUE_MARKER
end
def amf3_write_false
@stream << AMF3_FALSE_MARKER
end
def amf3_write_numeric num
if !num.integer? || num < MIN_INTEGER || num > MAX_INTEGER # Check valid range for 29 bits
@stream << AMF3_DOUBLE_MARKER
@stream << pack_double(num)
else
@stream << AMF3_INTEGER_MARKER
@stream << pack_integer(num)
end
end
def amf3_write_string str
@stream << AMF3_STRING_MARKER
amf3_write_utf8_vr str
end
def amf3_write_time time
@stream << AMF3_DATE_MARKER
if @object_cache[time] != nil
amf3_write_reference @object_cache[time]
else
# Cache time
@object_cache.add_obj time
# Build AMF string
time = time.getutc # Dup and convert to UTC
milli = (time.to_f * 1000).to_i
@stream << AMF3_NULL_MARKER
@stream << pack_double(milli)
end
end
def amf3_write_date date
@stream << AMF3_DATE_MARKER
if @object_cache[date] != nil
amf3_write_reference @object_cache[date]
else
# Cache date
@object_cache.add_obj date
# Build AMF string
@stream << AMF3_NULL_MARKER
@stream << pack_double(date.strftime("%Q").to_i)
end
end
def amf3_write_byte_array array
@stream << AMF3_BYTE_ARRAY_MARKER
if @object_cache[array] != nil
amf3_write_reference @object_cache[array]
else
@object_cache.add_obj array
str = array.string
@stream << pack_integer(str.bytesize << 1 | 1)
@stream << str
end
end
def amf3_write_array array
# Is it an array collection?
is_ac = false
if array.respond_to?(:is_array_collection?)
is_ac = array.is_array_collection?
else
is_ac = @class_mapper.use_array_collection
end
# Write type marker
@stream << (is_ac ? AMF3_OBJECT_MARKER : AMF3_ARRAY_MARKER)
# Write reference or cache array
if @object_cache[array] != nil
amf3_write_reference @object_cache[array]
return
else
@object_cache.add_obj array
@object_cache.add_obj nil if is_ac # The array collection source array
end
# Write out traits and array marker if it's an array collection
if is_ac
class_name = "flex.messaging.io.ArrayCollection"
if @trait_cache[class_name] != nil
@stream << pack_integer(@trait_cache[class_name] << 2 | 0x01)
else
@trait_cache.add_obj class_name
@stream << "\a" # Externalizable, non-dynamic
amf3_write_utf8_vr(class_name)
end
@stream << AMF3_ARRAY_MARKER
end
# Build AMF string for array
header = array.length << 1 # make room for a low bit of 1
header = header | 1 # set the low bit to 1
@stream << pack_integer(header)
@stream << AMF3_CLOSE_DYNAMIC_ARRAY
array.each do |elem|
amf3_serialize elem
end
end
def amf3_write_object obj, props=nil, traits=nil
@stream << AMF3_OBJECT_MARKER
# Caching...
if @object_cache[obj] != nil
amf3_write_reference @object_cache[obj]
return
end
@object_cache.add_obj obj
# Calculate traits if not given
is_default = false
if traits.nil?
traits = {
:class_name => @class_mapper.get_as_class_name(obj),
:members => [],
:externalizable => false,
:dynamic => true
}
is_default = true unless traits[:class_name]
end
class_name = is_default ? "__default__" : traits[:class_name]
# Write out traits
if (class_name && @trait_cache[class_name] != nil)
@stream << pack_integer(@trait_cache[class_name] << 2 | 0x01)
else
@trait_cache.add_obj class_name if class_name
# Write out trait header
header = 0x03 # Not object ref and not trait ref
header |= 0x02 << 2 if traits[:dynamic]
header |= 0x01 << 2 if traits[:externalizable]
header |= traits[:members].length << 4
@stream << pack_integer(header)
# Write out class name
if class_name == "__default__"
amf3_write_utf8_vr("")
else
amf3_write_utf8_vr(class_name.to_s)
end
# Write out members
traits[:members].each {|m| amf3_write_utf8_vr(m)}
end
# If externalizable, take externalized data shortcut
if traits[:externalizable]
obj.write_external(self)
return
end
# Extract properties if not given
props = @class_mapper.props_for_serialization(obj) if props.nil?
# Write out sealed properties
traits[:members].each do |m|
amf3_serialize props[m]
props.delete(m)
end
# Write out dynamic properties
if traits[:dynamic]
# Write out dynamic properties
props.sort.each do |key, val| # Sort props until Ruby 1.9 becomes common
amf3_write_utf8_vr key.to_s
amf3_serialize val
end
# Write close
@stream << AMF3_CLOSE_DYNAMIC_OBJECT
end
end
def amf3_write_utf8_vr str, encode=true
if str.respond_to?(:encode)
if encode
str = str.encode("UTF-8")
else
str = str.dup if str.frozen?
end
str.force_encoding("ASCII-8BIT")
end
if str == ''
@stream << AMF3_EMPTY_STRING
elsif @string_cache[str] != nil
amf3_write_reference @string_cache[str]
else
# Cache string
@string_cache.add_obj str
# Build AMF string
@stream << pack_integer(str.bytesize << 1 | 1)
@stream << str
end
end
end
class SerializerCache #:nodoc:
def self.new type
if type == :string
StringCache.new
elsif type == :object
ObjectCache.new
end
end
class StringCache < Hash #:nodoc:
def initialize
@cache_index = 0
end
def add_obj str
self[str] = @cache_index
@cache_index += 1
end
end
class ObjectCache < Hash #:nodoc:
def initialize
@cache_index = 0
@obj_references = []
end
def [] obj
super(obj.object_id)
end
def add_obj obj
@obj_references << obj
self[obj.object_id] = @cache_index
@cache_index += 1
end
end
end
end
end
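The string-reference optimization in `amf3_write_utf8_vr` is worth seeing in isolation: the first occurrence of a string is written inline with header `bytesize << 1 | 1`, and later occurrences become back-references `index << 1` into the string table. A simplified sketch (hypothetical helper; emits byte values as integers and assumes strings short enough for a one-byte U29 header):

```ruby
# Sketch of the AMF3 string-reference scheme used by amf3_write_utf8_vr.
def write_string_vr(str, cache, out)
  if cache.key?(str)
    out << (cache[str] << 1)         # back-reference: low bit 0
  else
    cache[str] = cache.size
    out << (str.bytesize << 1 | 1)   # inline string: low bit 1
    out.concat(str.bytes)
  end
end

out = []
cache = {}
write_string_vr("id", cache, out)
write_string_vr("id", cache, out)    # second write is a single reference byte
```

The repeated string costs one byte instead of three, which is why AMF3 payloads with many repeated property names stay compact.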


@ -0,0 +1,196 @@
module RocketAMF
# Container for the AMF request/response.
class Envelope
attr_reader :amf_version, :headers, :messages
def initialize props={}
@amf_version = props[:amf_version] || 0
@headers = props[:headers] || {}
@messages = props[:messages] || []
end
# Populates the envelope from the given stream or string using the given
# class mapper, or creates a new one. Returns self for easy chaining.
#
# Example:
#
# req = RocketAMF::Envelope.new.populate_from_stream(env['rack.input'].read)
#--
# Implemented in pure/remoting.rb RocketAMF::Pure::Envelope
def populate_from_stream stream, class_mapper=nil
raise AMFError, 'Must load "rocketamf/pure"'
end
# Creates the appropriate message and adds it to <tt>messages</tt> to call
# the given target using the standard (old) remoting APIs. You can call multiple
targets in the same request, unlike with the flex remoting APIs.
#
# Example:
#
# req = RocketAMF::Envelope.new
# req.call 'test', "arg_1", ["args", "args"]
# req.call 'Controller.action'
def call target, *args
raise "Cannot use different call types" unless @call_type.nil? || @call_type == :simple
@call_type = :simple
msg_num = messages.length+1
@messages << RocketAMF::Message.new(target, "/#{msg_num}", args)
end
# Creates the appropriate message and adds it to <tt>messages</tt> using the
# new flex (RemoteObject) remoting APIs. You can only make one flex remoting
# call per envelope, and the AMF version must be set to 3.
#
# Example:
#
# req = RocketAMF::Envelope.new :amf_version => 3
# req.call_flex 'Controller.action', "arg_1", ["args", "args"]
def call_flex target, *args
raise "Can only call one flex target per request" if @call_type == :flex
raise "Cannot use different call types" if @call_type == :simple
raise "Cannot use flex remoting calls with AMF0" if @amf_version != 3
@call_type = :flex
flex_msg = RocketAMF::Values::RemotingMessage.new
target_parts = target.split(".")
flex_msg.operation = target_parts.pop # Use pop so that a missing source is possible without issues
flex_msg.source = target_parts.pop
flex_msg.body = args
@messages << RocketAMF::Message.new('null', '/2', flex_msg) # /2 because a command message is always sent first
end
# Serializes the envelope to a string using the given class mapper, or creates
# a new one, and returns the result
#--
# Implemented in pure/remoting.rb RocketAMF::Pure::Envelope
def serialize class_mapper=nil
raise AMFError, 'Must load "rocketamf/pure"'
end
# Builds response from the request, iterating over each method call and using
# the return value as the method call's return value. Marks the envelope as
# constructed after running.
#--
# Iterate over all the sent messages. If they're something we can handle, like
# a command message, then simply add the response message ourselves. If it's
# a method call, then call the block with the method and args, catching errors
# for handling. Then create the appropriate response message using the return
# value of the block as the return value for the method call.
def each_method_call request, &block
raise 'Response already constructed' if @constructed
# Set version from response
# Can't just copy version because FMS sends version as 1
@amf_version = request.amf_version == 3 ? 3 : 0
request.messages.each do |m|
# What's the request body?
case m.data
when Values::CommandMessage
# Pings should be responded to with an AcknowledgeMessage built using the ping
# Everything else is unsupported
command_msg = m.data
if command_msg.operation == Values::CommandMessage::CLIENT_PING_OPERATION
response_value = Values::AcknowledgeMessage.new(command_msg)
else
e = Exception.new("CommandMessage #{command_msg.operation} not implemented")
e.set_backtrace ["RocketAMF::Envelope each_method_call"]
response_value = Values::ErrorMessage.new(command_msg, e)
end
when Values::RemotingMessage
# Using RemoteObject style message calls
remoting_msg = m.data
acknowledge_msg = Values::AcknowledgeMessage.new(remoting_msg)
method_base = remoting_msg.source.to_s.empty? ? '' : remoting_msg.source+'.'
body = dispatch_call :method => method_base+remoting_msg.operation, :args => remoting_msg.body, :source => remoting_msg, :block => block
# Response should be the bare ErrorMessage if there was an error
if body.is_a?(Values::ErrorMessage)
response_value = body
else
acknowledge_msg.body = body
response_value = acknowledge_msg
end
else
# Standard response message
response_value = dispatch_call :method => m.target_uri, :args => m.data, :source => m, :block => block
end
target_uri = m.response_uri
target_uri += response_value.is_a?(Values::ErrorMessage) ? '/onStatus' : '/onResult'
@messages << ::RocketAMF::Message.new(target_uri, '', response_value)
end
@constructed = true
end
# Returns the result of a response envelope, or an array of results if there
# are multiple action call messages. It automatically unwraps flex-style
# RemoteObject response messages, where the response result is inside a
# RocketAMF::Values::AcknowledgeMessage.
#
# Example:
#
# req = RocketAMF::Envelope.new
# req.call('TestController.test', 'first_arg', 'second_arg')
# res = RocketAMF::Envelope.new
# res.each_method_call req do |method, args|
# ['a', 'b']
# end
# res.result #=> ['a', 'b']
def result
results = []
messages.each do |msg|
if msg.data.is_a?(Values::AcknowledgeMessage)
results << msg.data.body
else
results << msg.data
end
end
results.length > 1 ? results : results[0]
end
# Whether or not the response has been constructed. Can be used to prevent
# serialization when no processing has taken place.
def constructed?
@constructed
end
# Return the serialized envelope as a string
def to_s
serialize
end
def dispatch_call p #:nodoc:
begin
p[:block].call(p[:method], p[:args])
rescue Exception => e
# Create ErrorMessage object using the source message as the base
Values::ErrorMessage.new(p[:source], e)
end
end
end
# RocketAMF::Envelope header
class Header
attr_accessor :name, :must_understand, :data
def initialize name, must_understand, data
@name = name
@must_understand = must_understand
@data = data
end
end
# RocketAMF::Envelope message
class Message
attr_accessor :target_uri, :response_uri, :data
def initialize target_uri, response_uri, data
@target_uri = target_uri
@response_uri = response_uri
@data = data
end
end
end
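The error handling in `dispatch_call` follows a simple pattern: run the handler block and, instead of letting an exception escape, return an error value built from it, so one failing call can't abort the rest of the batch. A standalone sketch of the same shape (hypothetical names; returns a plain hash where the library returns a `Values::ErrorMessage`):

```ruby
# Run a handler block, converting any exception into an error value.
def dispatch(method, args, &block)
  { result: block.call(method, args) }
rescue Exception => e
  { error: e.message } # stand-in for Values::ErrorMessage.new(source, e)
end
```

For example, `dispatch('add', [1, 2]) { |m, a| a.sum }` yields a result hash, while a block that raises yields an error hash with the exception message.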


@ -0,0 +1,214 @@
module RocketAMF
module Values #:nodoc:
# Base class for all special AS3 response messages. Maps to
# <tt>flex.messaging.messages.AbstractMessage</tt>.
class AbstractMessage
EXTERNALIZABLE_FIELDS = [
%w[ body clientId destination headers messageId timestamp timeToLive ],
%w[ clientIdBytes messageIdBytes ]
]
attr_accessor :clientId
attr_accessor :destination
attr_accessor :messageId
attr_accessor :timestamp
attr_accessor :timeToLive
attr_accessor :headers
attr_accessor :body
def clientIdBytes= bytes
@clientId = pretty_uuid(bytes) unless bytes.nil?
end
def messageIdBytes= bytes
@messageId = pretty_uuid(bytes) unless bytes.nil?
end
def read_external des
read_external_fields des, EXTERNALIZABLE_FIELDS
end
private
def rand_uuid
[8,4,4,4,12].map {|n| rand_hex_3(n)}.join('-').to_s
end
def rand_hex_3(l)
"%0#{l}x" % rand(1 << l*4)
end
def pretty_uuid bytes
"%08x-%04x-%04x-%04x-%08x%04x" % bytes.string.unpack("NnnnNn")
end
def read_external_fields des, fields
# Read flags
flags = []
loop do
flags << des.source.read(1).unpack('C').first
break if flags.last < 128
end
# Read fields and any remaining unmapped fields in a byte-set
fields.each_with_index do |list, i|
break if flags[i].nil?
list.each_with_index do |name, j|
if flags[i] & 2**j != 0
send("#{name}=", des.read_object)
end
end
# Read remaining flags even though we don't recognize them
# Zero out high bit, as it's the has-next-field marker
f = (flags[i] & ~128) >> list.length
while f > 0
des.read_object if (f & 1) != 0
f >>= 1
end
end
end
end
# Maps to <tt>flex.messaging.messages.RemotingMessage</tt>
class RemotingMessage < AbstractMessage
# The name of the service to be called including package name
attr_accessor :source
# The name of the method to be called
attr_accessor :operation
def initialize
@clientId = nil
@destination = nil
@messageId = rand_uuid
@timestamp = Time.new.to_i*100
@timeToLive = 0
@headers = {}
@body = nil
end
end
# Maps to <tt>flex.messaging.messages.AsyncMessage</tt>
class AsyncMessage < AbstractMessage
EXTERNALIZABLE_FIELDS = [
%w[ correlationId correlationIdBytes]
]
attr_accessor :correlationId
def correlationIdBytes= bytes
@correlationId = pretty_uuid(bytes) unless bytes.nil?
end
def read_external des
super des
read_external_fields des, EXTERNALIZABLE_FIELDS
end
end
class AsyncMessageExt < AsyncMessage #:nodoc:
end
# Maps to <tt>flex.messaging.messages.CommandMessage</tt>
class CommandMessage < AsyncMessage
SUBSCRIBE_OPERATION = 0
UNSUBSCRIBE_OPERATION = 1
POLL_OPERATION = 2
CLIENT_SYNC_OPERATION = 4
CLIENT_PING_OPERATION = 5
CLUSTER_REQUEST_OPERATION = 7
LOGIN_OPERATION = 8
LOGOUT_OPERATION = 9
SESSION_INVALIDATE_OPERATION = 10
MULTI_SUBSCRIBE_OPERATION = 11
DISCONNECT_OPERATION = 12
UNKNOWN_OPERATION = 10000
EXTERNALIZABLE_FIELDS = [
%w[ operation ]
]
attr_accessor :operation
def initialize
@operation = UNKNOWN_OPERATION
end
def read_external des
super des
read_external_fields des, EXTERNALIZABLE_FIELDS
end
end
class CommandMessageExt < CommandMessage #:nodoc:
end
# Maps to <tt>flex.messaging.messages.AcknowledgeMessage</tt>
class AcknowledgeMessage < AsyncMessage
EXTERNALIZABLE_FIELDS = [[]]
def initialize message=nil
@clientId = rand_uuid
@destination = nil
@messageId = rand_uuid
@timestamp = Time.new.to_i*100
@timeToLive = 0
@headers = {}
@body = nil
if message.is_a?(AbstractMessage)
@correlationId = message.messageId
end
end
def read_external des
super des
read_external_fields des, EXTERNALIZABLE_FIELDS
end
end
class AcknowledgeMessageExt < AcknowledgeMessage #:nodoc:
end
# Maps to <tt>flex.messaging.messages.ErrorMessage</tt> in AMF3 mode
class ErrorMessage < AcknowledgeMessage
# Extended data that will facilitate custom error processing on the client
attr_accessor :extendedData
# The fault code for the error, which defaults to the class name of the
# causing exception
attr_accessor :faultCode
# Detailed description of what caused the error
attr_accessor :faultDetail
# A simple description of the error
attr_accessor :faultString
# Optional "root cause" of the error
attr_accessor :rootCause
def initialize message=nil, exception=nil
super message
unless exception.nil?
@e = exception
@faultCode = @e.class.name
@faultDetail = @e.backtrace.join("\n")
@faultString = @e.message
end
end
def encode_amf serializer
if serializer.version == 0
data = {
:faultCode => @faultCode,
:faultDetail => @faultDetail,
:faultString => @faultString
}
serializer.write_object(data)
else
serializer.write_object(self)
end
end
end
end
end
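The flag bytes consumed by `read_external_fields` above encode field presence: each byte's low 7 bits mark which fields in the corresponding list were serialized, and the high bit (0x80) means another flag byte follows. A standalone sketch of that scheme (helper names are hypothetical):

```ruby
# Collect flag bytes until one without the continuation (0x80) bit appears.
def parse_flags(bytes)
  flags = []
  i = 0
  loop do
    flags << bytes[i]
    break if bytes[i] < 0x80
    i += 1
  end
  flags
end

# Map a flag byte's low bits onto a field-name list: bit j set => field j present.
def present_fields(flag, names)
  names.each_with_index.select { |_, j| flag & (1 << j) != 0 }.map(&:first)
end
```

So a first flag byte of `0x83` (continuation bit plus bits 0 and 1) marks the first two fields of the first list as present and promises a second flag byte.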


@ -0,0 +1,13 @@
module RocketAMF
module Values #:nodoc:
# Hash-like object that can store a type string. Used to preserve type information
# for unmapped objects after deserialization.
class TypedHash < Hash
attr_reader :type
def initialize type
@type = type
end
end
end
end


@ -0,0 +1,110 @@
require "spec_helper.rb"
describe RocketAMF::ClassMapping do
before :each do
RocketAMF::ClassMapping.reset
RocketAMF::ClassMapping.define do |m|
m.map :as => 'ASClass', :ruby => 'ClassMappingTest'
end
@mapper = RocketAMF::ClassMapping.new
end
describe "class name mapping" do
it "should allow resetting of mappings back to defaults" do
@mapper.get_as_class_name('ClassMappingTest').should_not be_nil
RocketAMF::ClassMapping.reset
@mapper = RocketAMF::ClassMapping.new
@mapper.get_as_class_name('ClassMappingTest').should be_nil
@mapper.get_as_class_name('RocketAMF::Values::AcknowledgeMessage').should_not be_nil
end
it "should return AS class name for ruby objects" do
@mapper.get_as_class_name(ClassMappingTest.new).should == 'ASClass'
@mapper.get_as_class_name('ClassMappingTest').should == 'ASClass'
@mapper.get_as_class_name(RocketAMF::Values::TypedHash.new('ClassMappingTest')).should == 'ASClass'
@mapper.get_as_class_name('BadClass').should be_nil
end
it "should instantiate a ruby class" do
@mapper.get_ruby_obj('ASClass').should be_a(ClassMappingTest)
end
it "should properly instantiate namespaced classes" do
RocketAMF::ClassMapping.mappings.map :as => 'ASClass', :ruby => 'ANamespace::TestRubyClass'
@mapper = RocketAMF::ClassMapping.new
@mapper.get_ruby_obj('ASClass').should be_a(ANamespace::TestRubyClass)
end
it "should return a hash with original type if not mapped" do
obj = @mapper.get_ruby_obj('UnmappedClass')
obj.should be_a(RocketAMF::Values::TypedHash)
obj.type.should == 'UnmappedClass'
end
it "should map special classes from AS by default" do
as_classes = [
'flex.messaging.messages.AcknowledgeMessage',
'flex.messaging.messages.CommandMessage',
'flex.messaging.messages.RemotingMessage'
]
as_classes.each do |as_class|
@mapper.get_ruby_obj(as_class).should_not be_a(RocketAMF::Values::TypedHash)
end
end
it "should map special classes from ruby by default" do
ruby_classes = [
'RocketAMF::Values::AcknowledgeMessage',
'RocketAMF::Values::ErrorMessage'
]
ruby_classes.each do |obj|
@mapper.get_as_class_name(obj).should_not be_nil
end
end
it "should allow config modification" do
RocketAMF::ClassMapping.mappings.map :as => 'SecondClass', :ruby => 'ClassMappingTest'
@mapper = RocketAMF::ClassMapping.new
@mapper.get_as_class_name(ClassMappingTest.new).should == 'SecondClass'
end
end
describe "ruby object populator" do
it "should populate a ruby class" do
obj = @mapper.populate_ruby_obj ClassMappingTest.new, {:prop_a => 'Data'}
obj.prop_a.should == 'Data'
end
it "should populate a typed hash" do
obj = @mapper.populate_ruby_obj RocketAMF::Values::TypedHash.new('UnmappedClass'), {:prop_a => 'Data'}
obj[:prop_a].should == 'Data'
end
end
describe "property extractor" do
it "should extract hash properties" do
hash = {:a => 'test1', 'b' => 'test2'}
props = @mapper.props_for_serialization(hash)
props.should == {'a' => 'test1', 'b' => 'test2'}
end
it "should extract object properties" do
obj = ClassMappingTest.new
obj.prop_a = 'Test A'
hash = @mapper.props_for_serialization obj
hash.should == {'prop_a' => 'Test A', 'prop_b' => nil}
end
it "should extract inherited object properties" do
obj = ClassMappingTest2.new
obj.prop_a = 'Test A'
obj.prop_c = 'Test C'
hash = @mapper.props_for_serialization obj
hash.should == {'prop_a' => 'Test A', 'prop_b' => nil, 'prop_c' => 'Test C'}
end
end
end
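The class-mapping behavior the specs above exercise can be sketched as a small bidirectional registry. This is a hedged illustration with hypothetical names (`NameRegistry`, `as_class_name`, `ruby_obj`), not RocketAMF's implementation; the real mapper also falls back to a `TypedHash` carrying the original type name for unmapped classes, which this sketch reduces to `nil`.

```ruby
# Minimal sketch of a bidirectional AS <-> Ruby class-name registry in the
# spirit of RocketAMF::ClassMapping. All names here are hypothetical.
class NameRegistry
  def initialize
    @as_for_ruby = {}
    @ruby_for_as = {}
  end

  def map(as:, ruby:)
    @as_for_ruby[ruby] = as
    @ruby_for_as[as] = ruby
  end

  # Accepts an instance or a class-name string, like get_as_class_name above.
  def as_class_name(obj)
    ruby_name = obj.is_a?(String) ? obj : obj.class.name
    @as_for_ruby[ruby_name]
  end

  # Resolves possibly namespaced names ("A::B") and instantiates;
  # returns nil for unmapped classes (the real mapper returns a TypedHash).
  def ruby_obj(as_name)
    ruby_name = @ruby_for_as[as_name]
    return nil unless ruby_name
    ruby_name.split("::").inject(Object) { |mod, c| mod.const_get(c) }.new
  end
end
```

The two-hash design is what makes both directions O(1), which matters because the mapper is consulted once per object during (de)serialization.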

@@ -0,0 +1,455 @@
# encoding: UTF-8
require "spec_helper.rb"
describe "when deserializing" do
before :each do
RocketAMF::ClassMapper.reset
end
it "should raise exception with invalid version number" do
lambda {
RocketAMF.deserialize("", 5)
}.should raise_error("unsupported version 5")
end
describe "AMF0" do
it "should update source pos if source is a StringIO object" do
input = StringIO.new(object_fixture('amf0-number.bin'))
input.pos.should == 0
output = RocketAMF.deserialize(input, 0)
input.pos.should == 9
end
it "should deserialize numbers" do
input = object_fixture('amf0-number.bin')
output = RocketAMF.deserialize(input, 0)
output.should == 3.5
end
it "should deserialize booleans" do
input = object_fixture('amf0-boolean.bin')
output = RocketAMF.deserialize(input, 0)
output.should === true
end
it "should deserialize UTF8 strings" do
input = object_fixture('amf0-string.bin')
output = RocketAMF.deserialize(input, 0)
output.should == "this is a テスト"
end
it "should deserialize nulls" do
input = object_fixture('amf0-null.bin')
output = RocketAMF.deserialize(input, 0)
output.should == nil
end
it "should deserialize undefineds" do
input = object_fixture('amf0-undefined.bin')
output = RocketAMF.deserialize(input, 0)
output.should == nil
end
it "should deserialize hashes" do
input = object_fixture('amf0-hash.bin')
output = RocketAMF.deserialize(input, 0)
output.should == {'a' => 'b', 'c' => 'd'}
end
it "should deserialize hashes with empty string keys" do
input = object_fixture('amf0-empty-string-key-hash.bin')
output = RocketAMF.deserialize(input, 0)
output.should == {'a' => 'b', 'c' => 'd', '' => 'last'}
end
it "should deserialize arrays from flash player" do
# Even Array is serialized as a "hash"
input = object_fixture('amf0-ecma-ordinal-array.bin')
output = RocketAMF.deserialize(input, 0)
output.should == {'0' => 'a', '1' => 'b', '2' => 'c', '3' => 'd'}
end
it "should deserialize strict arrays" do
input = object_fixture('amf0-strict-array.bin')
output = RocketAMF.deserialize(input, 0)
output.should == ['a', 'b', 'c', 'd']
end
it "should deserialize dates" do
input = object_fixture('amf0-time.bin')
output = RocketAMF.deserialize(input, 0)
output.should == Time.utc(2003, 2, 13, 5)
end
it "should deserialize an XML document" do
input = object_fixture('amf0-xml-doc.bin')
output = RocketAMF.deserialize(input, 0)
output.should == '<parent><child prop="test" /></parent>'
end
it "should deserialize anonymous objects" do
input = object_fixture('amf0-object.bin')
output = RocketAMF.deserialize(input, 0)
output.should == {'foo' => 'baz', 'bar' => 3.14}
output.type.should == ""
end
it "should deserialize an unmapped object as a dynamic anonymous object" do
input = object_fixture("amf0-typed-object.bin")
output = RocketAMF.deserialize(input, 0)
output.type.should == 'org.amf.ASClass'
output.should == {'foo' => 'bar', 'baz' => nil}
end
it "should deserialize a mapped object as a mapped ruby class instance" do
RocketAMF::ClassMapper.define {|m| m.map :as => 'org.amf.ASClass', :ruby => 'RubyClass'}
input = object_fixture("amf0-typed-object.bin")
output = RocketAMF.deserialize(input, 0)
output.should be_a(RubyClass)
output.foo.should == 'bar'
output.baz.should == nil
end
it "should deserialize references properly" do
input = object_fixture('amf0-ref-test.bin')
output = RocketAMF.deserialize(input, 0)
output.length.should == 2
output["0"].should === output["1"]
end
end
describe "AMF3" do
it "should update source pos if source is a StringIO object" do
input = StringIO.new(object_fixture('amf3-null.bin'))
input.pos.should == 0
output = RocketAMF.deserialize(input, 3)
input.pos.should == 1
end
describe "simple messages" do
it "should deserialize a null" do
input = object_fixture("amf3-null.bin")
output = RocketAMF.deserialize(input, 3)
output.should == nil
end
it "should deserialize a false" do
input = object_fixture("amf3-false.bin")
output = RocketAMF.deserialize(input, 3)
output.should == false
end
it "should deserialize a true" do
input = object_fixture("amf3-true.bin")
output = RocketAMF.deserialize(input, 3)
output.should == true
end
it "should deserialize integers" do
input = object_fixture("amf3-max.bin")
output = RocketAMF.deserialize(input, 3)
output.should == RocketAMF::MAX_INTEGER
input = object_fixture("amf3-0.bin")
output = RocketAMF.deserialize(input, 3)
output.should == 0
input = object_fixture("amf3-min.bin")
output = RocketAMF.deserialize(input, 3)
output.should == RocketAMF::MIN_INTEGER
end
it "should deserialize large integers" do
input = object_fixture("amf3-large-max.bin")
output = RocketAMF.deserialize(input, 3)
output.should == RocketAMF::MAX_INTEGER + 1
input = object_fixture("amf3-large-min.bin")
output = RocketAMF.deserialize(input, 3)
output.should == RocketAMF::MIN_INTEGER - 1
end
it "should deserialize BigNums" do
input = object_fixture("amf3-bignum.bin")
output = RocketAMF.deserialize(input, 3)
output.should == 2**1000
end
it "should deserialize a simple string" do
input = object_fixture("amf3-string.bin")
output = RocketAMF.deserialize(input, 3)
output.should == "String . String"
end
it "should deserialize a symbol as a string" do
input = object_fixture("amf3-symbol.bin")
output = RocketAMF.deserialize(input, 3)
output.should == "foo"
end
it "should deserialize dates" do
input = object_fixture("amf3-date.bin")
output = RocketAMF.deserialize(input, 3)
output.should == Time.at(0)
end
it "should deserialize XML" do
# XMLDocument tag
input = object_fixture("amf3-xml-doc.bin")
output = RocketAMF.deserialize(input, 3)
output.should == '<parent><child prop="test" /></parent>'
# XML tag
input = object_fixture("amf3-xml.bin")
output = RocketAMF.deserialize(input, 3)
output.should == '<parent><child prop="test"/></parent>'
end
end
describe "objects" do
it "should deserialize an unmapped object as a dynamic anonymous object" do
input = object_fixture("amf3-dynamic-object.bin")
output = RocketAMF.deserialize(input, 3)
expected = {
'property_one' => 'foo',
'nil_property' => nil,
'another_public_property' => 'a_public_value'
}
output.should == expected
output.type.should == ""
end
it "should deserialize a mapped object as a mapped ruby class instance" do
RocketAMF::ClassMapper.define {|m| m.map :as => 'org.amf.ASClass', :ruby => 'RubyClass'}
input = object_fixture("amf3-typed-object.bin")
output = RocketAMF.deserialize(input, 3)
output.should be_a(RubyClass)
output.foo.should == 'bar'
output.baz.should == nil
end
it "should deserialize externalizable objects" do
RocketAMF::ClassMapper.define {|m| m.map :as => 'ExternalizableTest', :ruby => 'ExternalizableTest'}
input = object_fixture("amf3-externalizable.bin")
output = RocketAMF.deserialize(input, 3)
output.length.should == 2
output[0].one.should == 5
output[1].two.should == 5
end
it "should deserialize a hash as a dynamic anonymous object" do
input = object_fixture("amf3-hash.bin")
output = RocketAMF.deserialize(input, 3)
output.should == {'foo' => "bar", 'answer' => 42}
end
it "should deserialize an empty array" do
input = object_fixture("amf3-empty-array.bin")
output = RocketAMF.deserialize(input, 3)
output.should == []
end
it "should deserialize an array of primitives" do
input = object_fixture("amf3-primitive-array.bin")
output = RocketAMF.deserialize(input, 3)
output.should == [1,2,3,4,5]
end
it "should deserialize an associative array" do
input = object_fixture("amf3-associative-array.bin")
output = RocketAMF.deserialize(input, 3)
output.should == {0=>"bar1", 1=>"bar2", 2=>"bar3", "asdf"=>"fdsa", "foo"=>"bar", "42"=>"bar"}
end
it "should deserialize an array of mixed objects" do
input = object_fixture("amf3-mixed-array.bin")
output = RocketAMF.deserialize(input, 3)
h1 = {'foo_one' => "bar_one"}
h2 = {'foo_two' => ""}
so1 = {'foo_three' => 42}
output.should == [h1, h2, so1, {}, [h1, h2, so1], [], 42, "", [], "", {}, "bar_one", so1]
end
it "should deserialize an array collection as an array" do
input = object_fixture("amf3-array-collection.bin")
output = RocketAMF.deserialize(input, 3)
output.class.should == Array
output.should == ["foo", "bar"]
end
it "should deserialize a complex set of array collections" do
RocketAMF::ClassMapper.define {|m| m.map :as => 'org.amf.ASClass', :ruby => 'RubyClass'}
input = object_fixture('amf3-complex-array-collection.bin')
output = RocketAMF.deserialize(input, 3)
output[0].should == ["foo", "bar"]
output[1][0].should be_a(RubyClass)
output[1][1].should be_a(RubyClass)
output[2].should === output[1]
end
it "should deserialize a byte array" do
input = object_fixture("amf3-byte-array.bin")
output = RocketAMF.deserialize(input, 3)
output.should be_a(StringIO)
expected = "\000\003これtest\100"
expected.force_encoding("ASCII-8BIT") if expected.respond_to?(:force_encoding)
output.string.should == expected
end
it "should deserialize an empty dictionary" do
input = object_fixture("amf3-empty-dictionary.bin")
output = RocketAMF.deserialize(input, 3)
output.should == {}
end
it "should deserialize a dictionary" do
input = object_fixture("amf3-dictionary.bin")
output = RocketAMF.deserialize(input, 3)
keys = output.keys
keys.length.should == 2
obj_key, str_key = keys[0].is_a?(RocketAMF::Values::TypedHash) ? [keys[0], keys[1]] : [keys[1], keys[0]]
obj_key.type.should == 'org.amf.ASClass'
output[obj_key].should == "asdf2"
str_key.should == "bar"
output[str_key].should == "asdf1"
end
it "should deserialize Vector.<int>" do
input = object_fixture('amf3-vector-int.bin')
output = RocketAMF.deserialize(input, 3)
output.should == [4, -20, 12]
end
it "should deserialize Vector.<uint>" do
input = object_fixture('amf3-vector-uint.bin')
output = RocketAMF.deserialize(input, 3)
output.should == [4, 20, 12]
end
it "should deserialize Vector.<Number>" do
input = object_fixture('amf3-vector-double.bin')
output = RocketAMF.deserialize(input, 3)
output.should == [4.3, -20.6]
end
it "should deserialize Vector.<Object>" do
input = object_fixture('amf3-vector-object.bin')
output = RocketAMF.deserialize(input, 3)
output[0]['foo'].should == 'foo'
output[1].type.should == 'org.amf.ASClass'
output[2]['foo'].should == 'baz'
end
end
describe "and implementing the AMF Spec" do
it "should keep references of duplicate strings" do
input = object_fixture("amf3-string-ref.bin")
output = RocketAMF.deserialize(input, 3)
foo = "foo"
bar = "str"
output.should == [foo, bar, foo, bar, foo, {'str' => "foo"}]
end
it "should not reference the empty string" do
input = object_fixture("amf3-empty-string-ref.bin")
output = RocketAMF.deserialize(input, 3)
output.should == ["",""]
end
it "should keep references of duplicate dates" do
input = object_fixture("amf3-date-ref.bin")
output = RocketAMF.deserialize(input, 3)
output[0].should == Time.at(0)
output[0].should equal(output[1])
# Expected object:
# [DateTime.parse "1/1/1970", DateTime.parse "1/1/1970"]
end
it "should keep reference of duplicate objects" do
input = object_fixture("amf3-object-ref.bin")
output = RocketAMF.deserialize(input, 3)
obj1 = {'foo' => "bar"}
obj2 = {'foo' => obj1['foo']}
output.should == [[obj1, obj2], "bar", [obj1, obj2]]
end
it "should keep reference of duplicate object traits" do
RocketAMF::ClassMapper.define {|m| m.map :as => 'org.amf.ASClass', :ruby => 'RubyClass'}
input = object_fixture("amf3-trait-ref.bin")
output = RocketAMF.deserialize(input, 3)
output[0].foo.should == "foo"
output[1].foo.should == "bar"
end
it "should keep references of duplicate arrays" do
input = object_fixture("amf3-array-ref.bin")
output = RocketAMF.deserialize(input, 3)
a = [1,2,3]
b = %w{ a b c }
output.should == [a, b, a, b]
end
it "should not keep references of duplicate empty arrays unless the object_id matches" do
input = object_fixture("amf3-empty-array-ref.bin")
output = RocketAMF.deserialize(input, 3)
a = []
b = []
output.should == [a,b,a,b]
end
it "should keep references of duplicate XML and XMLDocuments" do
input = object_fixture("amf3-xml-ref.bin")
output = RocketAMF.deserialize(input, 3)
output.should == ['<parent><child prop="test"/></parent>', '<parent><child prop="test"/></parent>']
end
it "should keep references of duplicate byte arrays" do
input = object_fixture("amf3-byte-array-ref.bin")
output = RocketAMF.deserialize(input, 3)
output[0].object_id.should == output[1].object_id
output[0].string.should == "ASDF"
end
it "should deserialize a deep object graph with circular references" do
input = object_fixture("amf3-graph-member.bin")
output = RocketAMF.deserialize(input, 3)
output['children'][0]['parent'].should === output
output['parent'].should === nil
output['children'].length.should == 2
# Expected object:
# parent = Hash.new
# child1 = Hash.new
# child1[:parent] = parent
# child1[:children] = []
# child2 = Hash.new
# child2[:parent] = parent
# child2[:children] = []
# parent[:parent] = nil
# parent[:children] = [child1, child2]
end
end
end
end
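Two wire-format details sit behind the expectations above, sketched here in plain Ruby (helper names like `amf0_write_number` are hypothetical, not RocketAMF's API). An AMF0 number is a one-byte 0x00 marker plus an 8-byte big-endian IEEE-754 double, which is why the StringIO spec expects `pos == 9` after reading one number. AMF3 integers use the variable-length U29 encoding, whose signed range of `-2**28` to `2**28 - 1` is where `MAX_INTEGER`/`MIN_INTEGER` come from; values just past that range, as in the "large integers" fixtures, are written as doubles instead.

```ruby
require "stringio"

# AMF0 number: 0x00 marker + 8-byte big-endian double = 9 bytes total.
def amf0_write_number(f)
  [0x00].pack("C") + [f].pack("G") # "G" = big-endian double
end

def amf0_read_number(io)
  raise "not an AMF0 number" unless io.read(1).unpack1("C") == 0x00
  io.read(8).unpack1("G")
end

# AMF3 U29: a 29-bit value in 1-4 bytes. The first three bytes carry
# 7 payload bits each plus a high continuation bit; a fourth byte, when
# present, carries a full 8 bits.
def encode_u29(n)
  raise RangeError unless (0...2**29).cover?(n)
  if n < 0x80
    [n].pack("C")
  elsif n < 0x4000
    [0x80 | (n >> 7), n & 0x7F].pack("C*")
  elsif n < 0x200000
    [0x80 | (n >> 14), 0x80 | ((n >> 7) & 0x7F), n & 0x7F].pack("C*")
  else
    [0x80 | (n >> 22), 0x80 | ((n >> 15) & 0x7F),
     0x80 | ((n >> 8) & 0x7F), n & 0xFF].pack("C*")
  end
end

def decode_u29(bytes)
  n = 0
  bytes.each_byte.with_index do |b, i|
    if i == 3
      n = (n << 8) | b # fourth byte: all 8 bits are payload
    else
      n = (n << 7) | (b & 0x7F)
      break if (b & 0x80).zero? # continuation bit clear: done
    end
  end
  n
end
```

Negative AMF3 integers are carried as 29-bit two's complement (`n & 0x1FFFFFFF`) in the same U29 container; the sketch stays with the unsigned core for brevity.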

@@ -0,0 +1,144 @@
require "spec_helper.rb"
describe RocketAMF::Ext::FastClassMapping do
before :each do
RocketAMF::Ext::FastClassMapping.reset
RocketAMF::Ext::FastClassMapping.define do |m|
m.map :as => 'ASClass', :ruby => 'ClassMappingTest'
end
@mapper = RocketAMF::Ext::FastClassMapping.new
end
describe "class name mapping" do
it "should allow resetting of mappings back to defaults" do
@mapper.get_as_class_name('ClassMappingTest').should_not be_nil
RocketAMF::Ext::FastClassMapping.reset
@mapper = RocketAMF::Ext::FastClassMapping.new
@mapper.get_as_class_name('ClassMappingTest').should be_nil
@mapper.get_as_class_name('RocketAMF::Values::AcknowledgeMessage').should_not be_nil
end
it "should return AS class name for ruby objects" do
@mapper.get_as_class_name(ClassMappingTest.new).should == 'ASClass'
@mapper.get_as_class_name('ClassMappingTest').should == 'ASClass'
@mapper.get_as_class_name(RocketAMF::Values::TypedHash.new('ClassMappingTest')).should == 'ASClass'
@mapper.get_as_class_name('BadClass').should be_nil
end
it "should instantiate a ruby class" do
@mapper.get_ruby_obj('ASClass').should be_a(ClassMappingTest)
end
it "should properly instantiate namespaced classes" do
RocketAMF::Ext::FastClassMapping.mappings.map :as => 'ASClass', :ruby => 'ANamespace::TestRubyClass'
@mapper = RocketAMF::Ext::FastClassMapping.new
@mapper.get_ruby_obj('ASClass').should be_a(ANamespace::TestRubyClass)
end
it "should return a hash with original type if not mapped" do
obj = @mapper.get_ruby_obj('UnmappedClass')
obj.should be_a(RocketAMF::Values::TypedHash)
obj.type.should == 'UnmappedClass'
end
it "should map special classes from AS by default" do
as_classes = [
'flex.messaging.messages.AcknowledgeMessage',
'flex.messaging.messages.CommandMessage',
'flex.messaging.messages.RemotingMessage'
]
as_classes.each do |as_class|
@mapper.get_ruby_obj(as_class).should_not be_a(RocketAMF::Values::TypedHash)
end
end
it "should map special classes from ruby by default" do
ruby_classes = [
'RocketAMF::Values::AcknowledgeMessage',
'RocketAMF::Values::ErrorMessage'
]
ruby_classes.each do |obj|
@mapper.get_as_class_name(obj).should_not be_nil
end
end
it "should allow config modification" do
RocketAMF::Ext::FastClassMapping.mappings.map :as => 'SecondClass', :ruby => 'ClassMappingTest'
@mapper = RocketAMF::Ext::FastClassMapping.new
@mapper.get_as_class_name(ClassMappingTest.new).should == 'SecondClass'
end
end
describe "ruby object populator" do
it "should populate a ruby class" do
obj = @mapper.populate_ruby_obj ClassMappingTest.new, {:prop_a => 'Data'}
obj.prop_a.should == 'Data'
end
it "should populate a typed hash" do
obj = @mapper.populate_ruby_obj RocketAMF::Values::TypedHash.new('UnmappedClass'), {'prop_a' => 'Data'}
obj['prop_a'].should == 'Data'
end
end
describe "property extractor" do
# Use symbol keys for properties in Ruby >1.9
def prop_hash hash
out = {}
if RUBY_VERSION =~ /^1\.8/
hash.each {|k,v| out[k.to_s] = v}
else
hash.each {|k,v| out[k.to_sym] = v}
end
out
end
it "should return hash without modification" do
hash = {:a => 'test1', 'b' => 'test2'}
props = @mapper.props_for_serialization(hash)
props.should === hash
end
it "should extract object properties" do
obj = ClassMappingTest.new
obj.prop_a = 'Test A'
hash = @mapper.props_for_serialization obj
hash.should == prop_hash({'prop_a' => 'Test A', 'prop_b' => nil})
end
it "should extract inherited object properties" do
obj = ClassMappingTest2.new
obj.prop_a = 'Test A'
obj.prop_c = 'Test C'
hash = @mapper.props_for_serialization obj
hash.should == prop_hash({'prop_a' => 'Test A', 'prop_b' => nil, 'prop_c' => 'Test C'})
end
it "should cache property lookups by instance" do
class ClassMappingTest3; attr_accessor :prop_a; end;
# Cache properties
obj = ClassMappingTest3.new
hash = @mapper.props_for_serialization obj
# Add a method to ClassMappingTest3
class ClassMappingTest3; attr_accessor :prop_b; end;
# Test property list does not have new property
obj = ClassMappingTest3.new
obj.prop_a = 'Test A'
obj.prop_b = 'Test B'
hash = @mapper.props_for_serialization obj
hash.should == prop_hash({'prop_a' => 'Test A'})
# Test that new class mapper *does* have new property (cache per instance)
@mapper = RocketAMF::Ext::FastClassMapping.new
hash = @mapper.props_for_serialization obj
hash.should == prop_hash({'prop_a' => 'Test A', 'prop_b' => 'Test B'})
end
end
end
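The caching behavior the last spec pins down — getters computed once per class, so methods added later are invisible until a fresh mapper is built — can be sketched like this. `PropExtractor` and `props_for` are hypothetical names for illustration, not the gem's C-extension internals.

```ruby
# Sketch of per-mapper property caching: the list of zero-arity public
# getters is computed once per class and memoized, so a method added to
# the class afterwards is not seen until the cache is rebuilt (here, by
# constructing a new extractor).
class PropExtractor
  def initialize
    @cache = {}
  end

  def props_for(obj)
    getters = @cache[obj.class] ||=
      obj.class.public_instance_methods(false).select { |m|
        obj.class.instance_method(m).arity.zero? && !m.to_s.end_with?("=")
      }
    getters.map { |m| [m, obj.public_send(m)] }.to_h
  end
end
```

Trading staleness for speed like this is reasonable for serializers, since classes rarely gain methods mid-request while property extraction runs on every serialized object.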

(The remaining files in this diff are binary AMF fixture files; their contents are not rendered here.)

Some files were not shown because too many files have changed in this diff.