
Submit 3.5

Discussion in 'Plugins' started by Nick, Aug 1, 2009.

  1. petsagouris

    petsagouris Design & Development

    Use tags for that.
  2. Nick

    Nick Well-Known Member

    I don't know about grabbing YouTube descriptions specifically, but you could write a function to grab the meta description from a page.
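    A minimal sketch of such a function, using PHP's built-in get_meta_tags() (the helper name is illustrative, not part of the plugin):

    ```php
    <?php
    // Hypothetical helper (not part of the Submit plugin): fetch the
    // meta description of a page. get_meta_tags() parses the document
    // up to </head> and returns an array keyed by each meta tag's
    // lowercased "name" attribute; it accepts URLs when allow_url_fopen
    // is enabled, as well as local file paths.
    function grab_meta_description($url)
    {
        $tags = @get_meta_tags($url);           // false on failure
        if ($tags !== false && isset($tags['description'])) {
            return trim($tags['description']);
        }
        return '';                              // no description found
    }
    ```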

  3. chizzle

    chizzle New Member

    My site is about music; I'd want the user to describe what a post sounds like - if it's a remix, it'd sound like pop/hip-hop. I'm using tags to list the titles of the songs.
  4. chizzle

    chizzle New Member

    Last edited by a moderator: Nov 25, 2010
  5. petsagouris

    petsagouris Design & Development

    Even though tags can be used for anything you want, this sounds a little off. Can you give me an example of such a submission?

    BTW: I checked the database tables and the code. It is not possible to put a post in two categories; you would have to re-submit it in another category. (On the other hand, tags exist for exactly that mode of operation, as I told you before.)
  6. echocron

    echocron New Member

    Any chance of capturing the description of a page from the meta tags on a submitted url?
    Last edited: Dec 7, 2010
  7. Nick

    Nick Well-Known Member

    Sorry, I have no plans to add that feature into this plugin.

  8. mystermarque

    mystermarque New Member

    Hey guys,

    Would anyone know how to get Submit step 2 to check for and silently kill duplicate tags?

    It's an easy mistake for anyone to make and pulls a big ol' error when you do:

    Warning: Duplicate entry '294-TAG_NAME-1' for key 'tags_post_id' in /****/****/****/****.com/libs/extensions/ezSQL/mysql/ez_sql_mysql.php on line 264

    It still submits the post, but lists multiple tag duplicates on the post page. Probably not a good thing.

    Thanks in advance,
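    For what it's worth, here is a sketch of one way to strip duplicates before the insert, assuming the tags arrive as a single comma-separated string (the function name is illustrative, not the plugin's own):

    ```php
    <?php
    // Illustrative sketch: normalize a comma-separated tag string and
    // silently drop duplicates (case-insensitive) before it reaches
    // the database.
    function dedupe_tags($tag_string)
    {
        $tags = array_map('trim', explode(',', $tag_string));
        $tags = array_filter($tags, 'strlen');   // drop empty entries
        $seen = array();
        foreach ($tags as $tag) {
            $key = mb_strtolower($tag, 'UTF-8'); // compare case-insensitively
            if (!isset($seen[$key])) {
                $seen[$key] = $tag;              // keep the first spelling
            }
        }
        return implode(', ', array_values($seen));
    }
    ```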
  9. mystermarque

    mystermarque New Member

    echocron likes this.
  10. echocron

    echocron New Member

    mystermarque - I appreciate the link. I tried it - it was easy and it works great! Thanks!
  11. Gunaxin

    Gunaxin Well-Known Member

  12. Terko

    Terko New Member

    I have similar problems. Some of the articles cannot be submitted. For example:
    Others are submitted, but their URLs are cut short, and this leads to problems - the truncated URLs are not valid.
    This happens especially when they are written in Cyrillic - they become much longer, and maybe some special characters are not accepted.
  13. Terko

    Terko New Member

  14. mystermarque

    mystermarque New Member

    Hey guys,

    I've got two unscrapeable URLs here also:
    This one pulls the same error as Terko's news.ibox.bg example above (the red "Nothing submitted..." error message, with only the URL form field underneath, like submit step 1).
    I thought maybe it was because this example uses frames, but the news.ibox.bg site does not and shows the same error.

    Also, the following address returns an entirely blank submit step 2 form, like Gunaxin's example ("No Title Found" in the title, and no images, clever description, or tag scrape either).

    Would anyone have any ideas? The metadata for all examples is present in their respective source codes (and in a standard fashion), just can't seem to grab it.
    Last edited: Feb 15, 2011
  15. Terko

    Terko New Member

    I've managed to fix one of the problems. Go to phpMyAdmin and find the posts table, then edit the post_orig_url column. It's VARCHAR(255); I increased it to 450, and one of the longest URLs - one that is in Cyrillic and becomes very, very long once urlencoded - started to work.
    For example, see how long this URL is:
    It's not so long in Cyrillic, but once urlencoded it becomes huge.
    I haven't tested the other example with news.ibox.bg yet, but I will try now.
    I've tested the URL
    and it still does not work. Nothing is posted again.
    I made an attempt to see where the problem is. If I change _ to - in the address, I can get to Step 2. So the problem is the "_" underscore. I don't know how to fix this.
    The good news is that with my fix, the Maxim article also started to work :)
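    A pre-insert guard matching the fix above can catch the overflow before MySQL truncates the value; the column width and function name here are assumptions, not plugin code:

    ```php
    <?php
    // Assumed column width after widening the column, e.g. via
    // phpMyAdmin or: ALTER TABLE posts MODIFY post_orig_url VARCHAR(450);
    define('POST_ORIG_URL_WIDTH', 450);

    // A percent-encoded URL is plain ASCII, so strlen() counts the
    // same units MySQL applies to the VARCHAR limit here.
    function url_fits_column($url, $width = POST_ORIG_URL_WIDTH)
    {
        return strlen($url) <= $width;
    }
    ```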
    Last edited: Feb 16, 2011
  16. mystermarque

    mystermarque New Member

    Terko, thanks for attempting to solve this problem. Your fix only works (as expected) on your super-long Cyrillic link, but it is an adjustment we should all make to be prepared for long URLs.

    With the remaining links, we actually seem to have two problems here:

    The Maxim, news.ibox.bg and discountdance links seem to have URL problems. The scraper cannot recognize the special characters (_ +) in post/product pages - but if you try their 404 pages or homepages with no-nonsense URLs, it works fine.

    Neither the product pages nor the homepages at Anjolee and GetStronger fetch any data, however - indicating some other problem.

    Anyone else have any ideas?
  17. scrt

    scrt New Member

    Environment: Submit 3.2, Hotaru 1.4.2. Bug priority: low-medium.

    In the tag field, I can enter the same tag multiple times for a post. For example, in the tag field I enter aa, aa, aa, aa.

    All four aa entries are placed into the DB. Data entry should check for identical or duplicate tags - aa, aa, aa, aa - and strip them to avoid duplicates.

    Edit for mystermarque: Don't expect a solution based on the priority level I assigned. I only saw your mention after I posted. I'm leaving this here to note that the bug exists and that the duplicates are written to the DB. Also, you got a warning; I did not, so my case seems different. A strip-duplicates routine on tags is needed. Thanks.
    Last edited: Mar 14, 2011
  18. mystermarque

    mystermarque New Member

  19. sven

    sven New Member

    Your problems are related to the way Hotaru checks the title. Specifically, it's unable to parse these unusual titles. Add the following line to SubmitFunctions.php, above the line saying return $title at line no. 775, to alleviate the problem:
    $title = mb_convert_encoding($title, 'HTML-ENTITIES', "UTF-8");
    mystermarque likes this.
  20. rpermana

    rpermana Donor


    I have an issue here: most of my members submit their main URL instead of the blog post URL.
    Is there any way to prevent this?
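    One possible check (illustrative, not plugin code): reject URLs whose path is empty or just "/", which usually means the member pasted the site's homepage rather than a specific post:

    ```php
    <?php
    // Illustrative guard: flag URLs with no path beyond "/" so the
    // submit form can ask for a specific post URL instead.
    function is_homepage_url($url)
    {
        $path = parse_url($url, PHP_URL_PATH);   // null when no path present
        return $path === null || $path === '' || $path === '/';
    }
    ```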
