typo fixes #30

Closed · wants to merge 34 commits

34 commits
6c58179  typo fixes (dsteinbrunner, Aug 6, 2013)
7b97eca  typo fix (dsteinbrunner, Aug 6, 2013)
c96cc65  typo fix (dsteinbrunner, Aug 6, 2013)
9196309  typo fixes (dsteinbrunner, Aug 6, 2013)
1d458ae  typo fixes (dsteinbrunner, Aug 6, 2013)
846f978  typo fix (dsteinbrunner, Aug 6, 2013)
fc3e60a  typo fix (dsteinbrunner, Aug 6, 2013)
5bb3df4  typo fix (dsteinbrunner, Aug 6, 2013)
42d0bc4  typo fix (dsteinbrunner, Aug 6, 2013)
29f1d90  typo fix (dsteinbrunner, Aug 6, 2013)
8d5b7fd  typo fix (dsteinbrunner, Aug 6, 2013)
2d3ab08  typo fixes (dsteinbrunner, Aug 6, 2013)
63e8493  typo fix (dsteinbrunner, Aug 6, 2013)
cd5faae  typo fix (dsteinbrunner, Aug 6, 2013)
c8c5f7d  typo fix (dsteinbrunner, Aug 6, 2013)
631e577  typo fixes (dsteinbrunner, Aug 6, 2013)
21fc4f6  more typo fixes (dsteinbrunner, Aug 6, 2013)
7933acd  typo fix (dsteinbrunner, Aug 7, 2013)
ff80553  typo fixes (dsteinbrunner, Aug 7, 2013)
44e80fb  typo fix (dsteinbrunner, Aug 7, 2013)
1cc390e  typo fixes (dsteinbrunner, Aug 7, 2013)
8241b1f  typo fix (dsteinbrunner, Aug 7, 2013)
fb34d88  typo fixes (dsteinbrunner, Aug 7, 2013)
d862768  typo fixes (dsteinbrunner, Aug 7, 2013)
c22c91b  typo fixes (dsteinbrunner, Aug 7, 2013)
2dc07b0  typo fixes (dsteinbrunner, Aug 7, 2013)
31cf6cd  typo fixes (dsteinbrunner, Aug 7, 2013)
fd83cbe  typo fix (dsteinbrunner, Aug 7, 2013)
dcda0c5  typo fix (dsteinbrunner, Aug 7, 2013)
8f6afec  typo fixes (dsteinbrunner, Aug 7, 2013)
560f247  typo fix (dsteinbrunner, Aug 7, 2013)
3ef921e  typo fixes (dsteinbrunner, Aug 7, 2013)
862019a  typo fixes (dsteinbrunner, Aug 7, 2013)
4062bd1  typo fixes (dsteinbrunner, Aug 7, 2013)
2 changes: 1 addition & 1 deletion lib/DBIx/Class/Admin/Usage.pm
@@ -23,7 +23,7 @@ sub set_simple {
 
 
 
-# This returns the usage formated as a pod document
+# This returns the usage formatted as a pod document
 sub pod {
   my ($self) = @_;
   return join qq{\n}, $self->pod_leader_text, $self->pod_option_text, $self->pod_authorlic_text;
2 changes: 1 addition & 1 deletion lib/DBIx/Class/CDBICompat/ColumnsAsHash.pm
@@ -15,7 +15,7 @@ See DBIx::Class::CDBICompat for usage directions.
 
 =head1 DESCRIPTION
 
-Emulates the I<undocumnted> behavior of Class::DBI where the object can be accessed as a hash of columns. This is often used as a performance hack.
+Emulates the I<undocumented> behavior of Class::DBI where the object can be accessed as a hash of columns. This is often used as a performance hack.
 
   my $column = $result->{column};
 
8 changes: 4 additions & 4 deletions lib/DBIx/Class/Manual/Cookbook.pod
@@ -840,7 +840,7 @@ AKA multi-class object inflation from one table
 L<DBIx::Class> classes are proxy classes, therefore some different
 techniques need to be employed for more than basic subclassing. In
 this example we have a single user table that carries a boolean bit
-for admin. We would like like to give the admin users
+for admin. We would like to give the admin users
 objects (L<DBIx::Class::Row>) the same methods as a regular user but
 also special admin only methods. It doesn't make sense to create two
 separate proxy-class files for this. We would be copying all the user
@@ -1108,7 +1108,7 @@ as follows:
 
 =head2 Filtering a relationship result set
 
-If you want to get a filtered result set, you can just add add to $attr as follows:
+If you want to get a filtered result set, you can just add to $attr as follows:
 
   __PACKAGE__->has_many('pages' => 'Page', 'book', { where => { scrap => 0 } } );
 
@@ -1223,7 +1223,7 @@ building a renaming facility, like so:
 
 1;
 
-By overridding the L<connection|DBIx::Class::Schama/connection>
+By overriding the L<connection|DBIx::Class::Schama/connection>
 method and extracting a custom option from the provided \%attr hashref one can
 then simply iterate over all the Schema's ResultSources, renaming them as
 needed.
@@ -2190,7 +2190,7 @@ L<DBIx::Class|DBIx::Class> programs can have a significant startup delay
 as the ORM loads all the relevant classes. This section examines
 techniques for reducing the startup delay.
 
-These tips are are listed in order of decreasing effectiveness - so the
+These tips are listed in order of decreasing effectiveness - so the
 first tip, if applicable, should have the greatest effect on your
 application.
2 changes: 1 addition & 1 deletion lib/DBIx/Class/Manual/DocMap.pod
@@ -40,7 +40,7 @@ L<DBIx::Class::Row> and L<DBIx::Class::Relationship::Base> are used most often.
 
 =item L<DBIx::Class::ResultSource> - Source/Table definition functions.
 
-=item L<DBIx::Class::Schema> - Overall sourcess, and connection container.
+=item L<DBIx::Class::Schema> - Overall sources, and connection container.
 
 =item L<DBIx::Class::Relationship> - Simple relationship declarations.
6 changes: 3 additions & 3 deletions lib/DBIx/Class/Manual/FAQ.pod
@@ -75,7 +75,7 @@ lot later.
 
 =item .. use DBIx::Class across multiple databases?
 
-If your database server allows you to run querys across multiple
+If your database server allows you to run queries across multiple
 databases at once, then so can DBIx::Class. All you need to do is make
 sure you write the database name as part of the
 L<DBIx::Class::ResultSource/table> call. Eg:
@@ -314,7 +314,7 @@ Use L<DBIx::Class::Row/discard_changes>.
 
   $result->discard_changes
 
-Discarding changes and refreshing from storage are two sides fo the same coin. When you
+Discarding changes and refreshing from storage are two sides of the same coin. When you
 want to discard your local changes, just re-fetch the row from storage. When you want
 to get a new, fresh copy of the row, just re-fetch the row from storage.
 L<DBIx::Class::Row/discard_changes> does just that by re-fetching the row from storage
@@ -489,7 +489,7 @@ An another method is to use L<Moose> with your L<DBIx::Class> package.
 
   __PACKAGE__->table('foo'); # etc
 
-With either of these methods the resulting use of the accesssor would be
+With either of these methods the resulting use of the accessor would be
 
   my $result;
4 changes: 2 additions & 2 deletions lib/DBIx/Class/Manual/Features.pod
@@ -258,7 +258,7 @@ See L<DBIx::Class::Schema::Loader> and L<DBIx::Class::Schema::Loader::Base/CONST
 
 =head2 Populate
 
-Made for inserting lots of rows very quicky into database
+Made for inserting lots of rows very quickly into database
 
   $schema->populate([ Users =>
     [qw( username password )],
@@ -605,7 +605,7 @@ L<DBIx::Class::ResultSet/group_by>
 
 =over 1
 
-=item Careful, get_column can basicaly mean B<three> things
+=item Careful, get_column can basically mean B<three> things
 
 =item private in which case you should use an accessor
2 changes: 1 addition & 1 deletion lib/DBIx/Class/Manual/Intro.pod
@@ -420,7 +420,7 @@ similarity ends. Any time you call a CRUD operation on a row (e.g.
 L<delete|DBIx::Class::Row/delete>,
 L<update|DBIx::Class::Row/update>,
 L<discard_changes|DBIx::Class::Row/discard_changes>,
-etc.) DBIx::Class will use the values of of the
+etc.) DBIx::Class will use the values of the
 L<primary key|DBIx::Class::ResultSource/set_primary_key> columns to populate
 the C<WHERE> clause necessary to accomplish the operation. This is why it is
 important to declare a L<primary key|DBIx::Class::ResultSource/set_primary_key>
2 changes: 1 addition & 1 deletion lib/DBIx/Class/Optional/Dependencies.pm
@@ -897,7 +897,7 @@ EOD
   '=item Return Value: \%list_of_loaderrors_per_module',
   '=back',
   <<'EOD',
-Returns a hashref containing the actual errors that occured while attempting
+Returns a hashref containing the actual errors that occurred while attempting
 to load each module in the requirement group.
 EOD
   '=head1 AUTHOR',
4 changes: 2 additions & 2 deletions lib/DBIx/Class/Relationship/Base.pm
@@ -225,7 +225,7 @@ hashref which does not depend on joins being available, but the hashref must
 contain only plain values/deflatable objects, such that the result can be
 passed directly to L<DBIx::Class::Relationship::Base/set_from_related>. For
 instance the C<year> constraint in the above example prevents the relationship
-from being used to to create related objects (an exception will be thrown).
+from being used to create related objects (an exception will be thrown).
 
 In order to allow the user to go truly crazy when generating a custom C<ON>
 clause, the C<$args> hashref passed to the subroutine contains some extra
@@ -297,7 +297,7 @@ For a 'belongs_to relationship, note the 'cascade_update':
 =item \%column
 
 A hashref where each key is the accessor you want installed in the main class,
-and its value is the name of the original in the fireign class.
+and its value is the name of the original in the foreign class.
 
   MyApp::Schema::Track->belongs_to( cd => 'DBICTest::Schema::CD', 'cd', {
       proxy => { cd_title => 'title' },
2 changes: 1 addition & 1 deletion lib/DBIx/Class/Relationship/BelongsTo.pm
@@ -17,7 +17,7 @@ our %_pod_inherit_config =
 sub belongs_to {
   my ($class, $rel, $f_class, $cond, $attrs) = @_;
 
-  # assume a foreign key contraint unless defined otherwise
+  # assume a foreign key constraint unless defined otherwise
   $attrs->{is_foreign_key_constraint} = 1
     if not exists $attrs->{is_foreign_key_constraint};
   $attrs->{undef_on_null_fk} = 1
16 changes: 8 additions & 8 deletions lib/DBIx/Class/ResultSet.pm
@@ -1440,7 +1440,7 @@ sub _construct_results {
     : 'classic_nonpruning'
   ;
 
-  # $args and $attrs to _mk_row_parser are seperated to delineate what is
+  # $args and $attrs to _mk_row_parser are separated to delineate what is
   # core collapser stuff and what is dbic $rs specific
   @{$self->{_row_parser}{$parser_type}}{qw(cref nullcheck)} = $rsrc->_mk_row_parser({
     eval => 1,
@@ -1456,7 +1456,7 @@ sub _construct_results {
   # can't work without it). Add an explicit check for the *main*
   # result, hopefully this will gradually weed out such errors
   #
-  # FIXME - this is a temporary kludge that reduces perfromance
+  # FIXME - this is a temporary kludge that reduces performance
   # It is however necessary for the time being
   my ($unrolled_non_null_cols_to_check, $err);
 
@@ -2323,7 +2323,7 @@ sub populate {
 }
 
 
-# populate() argumnets went over several incarnations
+# populate() arguments went over several incarnations
 # What we ultimately support is AoH
 sub _normalize_populate_args {
   my ($self, $arg) = @_;
@@ -2497,7 +2497,7 @@ sub _merge_with_rscond {
     );
   }
   else {
-    # precendence must be given to passed values over values inherited from
+    # precedence must be given to passed values over values inherited from
     # the cond, so the order here is important.
     my $collapsed_cond = $self->_collapse_cond($self->{cond});
     my %implied = %{$self->_remove_alias($collapsed_cond, $alias)};
@@ -2532,7 +2532,7 @@ sub _merge_with_rscond {
 # determines if the resultset defines at least one
 # of the attributes supplied
 #
-# used to determine if a subquery is neccessary
+# used to determine if a subquery is necessary
 #
 # supports some virtual attributes:
 #  -join
@@ -3593,7 +3593,7 @@ sub _resolved_attrs {
   }
 
   # run through the resulting joinstructure (starting from our current slot)
-  # and unset collapse if proven unnesessary
+  # and unset collapse if proven unnecessary
   #
   # also while we are at it find out if the current root source has
   # been premultiplied by previous related_source chaining
@@ -4219,7 +4219,7 @@ object with all of its related data.
 If an L</order_by> is already declared, and orders the resultset in a way that
 makes collapsing as described above impossible (e.g. C<< ORDER BY
 has_many_rel.column >> or C<ORDER BY RANDOM()>), DBIC will automatically
-switch to "eager" mode and slurp the entire resultset before consturcting the
+switch to "eager" mode and slurp the entire resultset before constructing the
 first object returned by L</next>.
 
 Setting this attribute on a resultset that does not join any has_many
@@ -4643,7 +4643,7 @@ or to a sensible value based on the "data_type".
 =item dbic_colname
 
 Used to fill in missing sqlt_datatype and sqlt_size attributes (if they are
-explicitly specified they are never overriden). Also used by some weird DBDs,
+explicitly specified they are never overridden). Also used by some weird DBDs,
 where the column name should be available at bind_param time (e.g. Oracle).
 
 =back
2 changes: 1 addition & 1 deletion lib/DBIx/Class/ResultSource.pm
@@ -1664,7 +1664,7 @@ our $UNRESOLVABLE_CONDITION = \ '1 = 0';
 
 # Resolves the passed condition to a concrete query fragment and a flag
 # indicating whether this is a cross-table condition. Also an optional
-# list of non-triviail values (notmally conditions) returned as a part
+# list of non-trivial values (normally conditions) returned as a part
 # of a joinfree condition hash
 sub _resolve_condition {
   my ($self, $cond, $as, $for, $rel_name) = @_;
6 changes: 3 additions & 3 deletions lib/DBIx/Class/ResultSource/RowParser.pm
@@ -93,7 +93,7 @@ sub _resolve_prefetch {
 # any sort of adjustment/rewrite should be relatively easy (fsvo relatively)
 #
 sub _mk_row_parser {
-  # $args and $attrs are seperated to delineate what is core collapser stuff and
+  # $args and $attrs are separated to delineate what is core collapser stuff and
   # what is dbic $rs specific
   my ($self, $args, $attrs) = @_;
 
@@ -243,7 +243,7 @@ sub _resolve_collapse {
       if $args->{_parent_info}{collapser_reusable};
   }
 
-  # Still dont know how to collapse - try to resolve based on our columns (plus already inserted FK bridges)
+  # Still don't know how to collapse - try to resolve based on our columns (plus already inserted FK bridges)
   if (
     ! $collapse_map->{-identifying_columns}
       and
@@ -364,7 +364,7 @@ sub _resolve_collapse {
   # if we got here - we are good to go, but the construction is tricky
   # since our children will want to include our collapse criteria - we
   # don't give them anything (safe, since they are all collapsible on their own)
-  # in addition we record the individual collapse posibilities
+  # in addition we record the individual collapse possibilities
   # of all left children node collapsers, and merge them in the rowparser
   # coderef later
   $collapse_map->{-identifying_columns} = [];
2 changes: 1 addition & 1 deletion lib/DBIx/Class/Row.pm
@@ -118,7 +118,7 @@ with NULL as the default, and save yourself a SELECT.
 =cut
 
 ## It needs to store the new objects somewhere, and call insert on that list later when insert is called on this object. We may need an accessor for these so the user can retrieve them, if just doing ->new().
-## This only works because DBIC doesnt yet care to check whether the new_related objects have been passed all their mandatory columns
+## This only works because DBIC doesn't yet care to check whether the new_related objects have been passed all their mandatory columns
 ## When doing the later insert, we need to make sure the PKs are set.
 ## using _relationship_data in new and funky ways..
 ## check Relationship::CascadeActions and Relationship::Accessor for compat
4 changes: 2 additions & 2 deletions lib/DBIx/Class/SQLMaker.pm
@@ -278,7 +278,7 @@ sub _recurse_fields {
 # What we have been doing forever is hijacking the $order arg of
 # SQLA::select to pass in arbitrary pieces of data (first the group_by,
 # then pretty much the entire resultset attr-hash, as more and more
-# things in the SQLA space need to have mopre info about the $rs they
+# things in the SQLA space need to have more info about the $rs they
 # create SQL for. The alternative would be to keep expanding the
 # signature of _select with more and more positional parameters, which
 # is just gross. All hail SQLA2!
@@ -288,7 +288,7 @@ sub _parse_rs_attrs {
   my $sql = '';
 
   if ($arg->{group_by}) {
-    # horible horrible, waiting for refactor
+    # horrible, waiting for refactor
     local $self->{select_bind};
     if (my $g = $self->_recurse_fields($arg->{group_by}) ) {
       $sql .= $self->_sqlcase(' group by ') . $g;
2 changes: 1 addition & 1 deletion lib/DBIx/Class/SQLMaker/ACCESS.pm
@@ -27,7 +27,7 @@ sub _recurse_from {
     $fin_join = sprintf '( %s ) %s', $fin_join, (shift @j);
   }
 
-  # the entire FROM is *ALSO* expected aprenthesized
+  # the entire FROM is *ALSO* expected parenthesized
   "( $fin_join )";
 }
 
4 changes: 2 additions & 2 deletions lib/DBIx/Class/SQLMaker/Oracle.pm
@@ -92,7 +92,7 @@ sub _order_siblings_by {
   return wantarray ? ( $sql, @bind ) : $sql;
 }
 
-# we need to add a '=' only when PRIOR is used against a column diretly
+# we need to add a '=' only when PRIOR is used against a column directly
 # i.e. when it is invoked by a special_op callback
 sub _where_field_PRIOR {
   my ($self, $lhs, $op, $rhs) = @_;
@@ -177,7 +177,7 @@ sub _shorten_identifier {
     }
   }
 
-  # still too long - just start cuting proportionally
+  # still too long - just start cutting proportionally
   if ($concat_len > $max_trunc) {
     my $trim_ratio = $max_trunc / $concat_len;
 
2 changes: 1 addition & 1 deletion lib/DBIx/Class/Schema.pm
@@ -1000,7 +1000,7 @@ sub svp_rollback {
 
 Clones the schema and its associated result_source objects and returns the
 copy. The resulting copy will have the same attributes as the source schema,
-except for those attributes explicitly overriden by the provided C<%attrs>.
+except for those attributes explicitly overridden by the provided C<%attrs>.
 
 =cut
 
2 changes: 1 addition & 1 deletion lib/DBIx/Class/Schema/Versioned.pm
@@ -109,7 +109,7 @@ Downgrades in addition to upgrades
 
 =item *
 
-Multiple sql files files per upgrade/downgrade/install
+Multiple sql files per upgrade/downgrade/install
 
 =item *
 
4 changes: 2 additions & 2 deletions lib/DBIx/Class/Storage/BlockRunner.pm
@@ -114,7 +114,7 @@ sub run {
 
 # this is the actual recursing worker
 sub _run {
-  # warnings here mean I did not anticipate some ueber-complex case
+  # warnings here mean I did not anticipate some uber-complex case
   # fatal warnings are not warranted
   no warnings;
   use warnings;
@@ -208,7 +208,7 @@ sub _run {
 
       $storage->ensure_connected;
       # if txn_depth is > 1 this means something was done to the
-      # original $dbh, otherwise we would not get past the preceeding if()
+      # original $dbh, otherwise we would not get past the preceding if()
       $storage->throw_exception(sprintf
        'Unexpected transaction depth of %d on freshly connected handle',
        $storage->transaction_depth,
4 changes: 2 additions & 2 deletions lib/DBIx/Class/Storage/DBI.pm
@@ -792,7 +792,7 @@ sub dbh_do {
 
   # short circuit when we know there is no need for a runner
   #
-  # FIXME - asumption may be wrong
+  # FIXME - assumption may be wrong
   # the rationale for the txn_depth check is that if this block is a part
   # of a larger transaction, everything up to that point is screwed anyway
   return $self->$run_target($self->_get_dbh, @_)
@@ -2938,7 +2938,7 @@ sub deployment_statements {
     $self->throw_exception("Can't deploy without a ddl_dir or " . DBIx::Class::Optional::Dependencies->req_missing_for ('deploy') );
   }
 
-  # sources needs to be a parser arg, but for simplicty allow at top level
+  # sources needs to be a parser arg, but for simplicity allow at top level
   # coming in
   $sqltargs->{parser_args}{sources} = delete $sqltargs->{sources}
     if exists $sqltargs->{sources};
2 changes: 1 addition & 1 deletion lib/DBIx/Class/Storage/DBI/ADO/Microsoft_SQL_Server.pm
@@ -233,7 +233,7 @@ sub bind_attribute_by_data_type {
 }
 
 # FIXME This list is an abomination. We need a way to do this outside
-# of the scope of DBIC, as as it is right now nobody will ever think to
+# of the scope of DBIC, as it is right now nobody will ever think to
 # even look here to diagnose some sort of misbehavior.
 sub _mssql_max_data_type_representation_size_in_bytes {
   my $self = shift;
2 changes: 1 addition & 1 deletion lib/DBIx/Class/Storage/DBI/AutoCast.pm
@@ -42,7 +42,7 @@ sub _prep_for_execute {
   my ($sql, $bind) = $self->next::method (@_);
 
   # If we're using ::NoBindVars, there are no binds by this point so this code
-  # gets skippeed.
+  # gets skipped.
   if ($self->auto_cast && @$bind) {
     my $new_sql;
     my @sql_part = split /\?/, $sql, scalar @$bind + 1;
2 changes: 1 addition & 1 deletion lib/DBIx/Class/Storage/DBI/IdentityInsert.pm
@@ -29,7 +29,7 @@ toggles like:
 =cut
 
 # SET IDENTITY_X only works as part of a statement scope. We can not
-# $dbh->do the $sql and the wrapping set()s individualy. Hence the
+# $dbh->do the $sql and the wrapping set()s individually. Hence the
 # sql mangling. The newlines are important.
 sub _prep_for_execute {
   my $self = shift;