I did random character creation and was lucky: the first three characters immediately formed a good base for a standard RPG party, although they still lack some useful abilities (like melee). Please note that I don't play for challenge but for narration, so I play in easy mode and with some house rules to make things more reasonable (such as giving starting characters improvised daggers, plus any equipment it would be unreasonable for them not to have given their skills). Setup is finished, though, and I am very pleased with character creation. I haven't really started playing because I have been busy at work, and in my free time there have been too many distractions. Keep your eyes on this site for further updates, and happy SCRAWLing!

SCRAWL, short for Solo Crawl, is a solo fantasy game system which encourages old-style crawling around wildernesses, cities and dungeons: exploring strange new locations, looking for treasure, meeting interesting creatures and slaying them. This is 100%, good old-fashioned, completely unapologetic murder-hobo crawling. This is the revised edition, which has 8 books that enable you to take a party of up to 4 characters to explore the overland, stay in settlements and then loot dungeons. This edition is similar to the Ultimate edition, except it now has all of my ideas bundled up into one little package: there are terrain generators, site and settlement generators and dungeon generators for a large variety of terrains, settlements and dungeons. You can make the game as simple or as complicated as you like - you can create a party just to explore a dungeon, you could attempt solo gamebooks (which will be released in the future), or you could do the whole caboodle and generate an overland populated with all kinds of places which you can keep records of and explore to your heart's content. This is still a beta version - feedback will be welcome, and I hope to get all of the cover art, at least, specially made for SCRAWL.
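If you like the idea of random party creation but want to automate the dice, here is a minimal sketch of the approach. Note that the archetype list and the single d6 "vigour" score here are made-up placeholders for illustration, not SCRAWL's actual creation tables:

```python
import random

# Hypothetical archetype table -- placeholders, not SCRAWL's real tables.
ARCHETYPES = ["Warrior", "Rogue", "Healer", "Mage", "Ranger", "Bard"]

def roll_character(rng):
    """Roll one random character: a random archetype plus a d6 score."""
    return {"archetype": rng.choice(ARCHETYPES), "vigour": rng.randint(1, 6)}

def roll_party(size=4, seed=None):
    """Generate a party of randomly created characters.

    Passing a seed makes the rolls repeatable, which is handy if you
    want to log or share a particular party.
    """
    rng = random.Random(seed)
    return [roll_character(rng) for _ in range(size)]

for member in roll_party(size=3, seed=1):
    print(member)
```

Seeding the generator is optional; leave `seed=None` for genuinely random rolls at the table.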
Welcome to SCRAWL: Revised edition (Beta). Please note that this is the latest version of SCRAWL. There is also SCRAWL: Ultimate edition, which I am no longer making products for, but you are free to download it for PWYW. (EDIT: I've changed this to free, not PWYW.) If you have money to spare, please use it to help the people in Ukraine instead.